WO2021093635A1 - Image processing method and apparatus, electronic device, and computer-readable storage medium - Google Patents

Image processing method and apparatus, electronic device, and computer-readable storage medium

Info

Publication number
WO2021093635A1
Authority
WO
WIPO (PCT)
Prior art keywords
phase difference
image
target
sub
brightness map
Application number
PCT/CN2020/126122
Other languages
English (en)
Chinese (zh)
Inventor
贾玉虎
Original Assignee
Oppo广东移动通信有限公司
Application filed by Oppo广东移动通信有限公司
Publication of WO2021093635A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/67 - Focus control based on electronic image sensor signals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/698 - Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 - Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 - Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio

Definitions

  • This application relates to the field of image processing technology, and in particular to an image processing method and device, electronic equipment, and computer-readable storage media.
  • The current focusing method is to focus within a rectangular frame. However, the rectangular frame usually contains both the foreground and the background, and focus can only be achieved at a single position to reach quasi-focus: when focusing on the foreground, the background is out of focus, and when focusing on the background, the foreground is out of focus.
  • Traditional image processing methods therefore have the problem of low image clarity.
  • an image processing method, apparatus, electronic device, and computer-readable storage medium are provided.
  • An image processing method applied to an electronic device includes: acquiring a preview image; dividing the preview image into at least two sub-areas; acquiring a phase difference corresponding to each of the at least two sub-areas; determining at least two target phase differences from the phase differences corresponding to the sub-areas, the at least two target phase differences including a target foreground phase difference and a target background phase difference; performing focusing according to each target phase difference to obtain an image corresponding to each target phase difference; and synthesizing the images corresponding to each target phase difference to obtain a fully in-focus image.
  • An image processing device characterized in that it comprises:
  • a preview image acquisition module, configured to acquire a preview image;
  • a dividing module, configured to divide the preview image into at least two sub-areas;
  • a phase difference acquisition module, configured to acquire the phase difference corresponding to each of the at least two sub-areas;
  • the phase difference acquisition module is further configured to determine at least two target phase differences from the phase differences corresponding to the sub-areas, the at least two target phase differences including a target foreground phase difference and a target background phase difference;
  • a focusing module, configured to perform focusing according to each target phase difference to obtain an image corresponding to each target phase difference; and
  • a synthesis module, configured to synthesize the images corresponding to each target phase difference to obtain a fully in-focus image.
  • An electronic device includes a memory and a processor, with a computer program stored in the memory; when the computer program is executed by the processor, the processor performs the steps of the above image processing method, including synthesizing the images corresponding to each target phase difference to obtain a fully in-focus image.
  • A computer-readable storage medium has a computer program stored thereon; when the computer program is executed by a processor, the steps of the above image processing method are implemented, including synthesizing the images corresponding to each target phase difference to obtain a fully in-focus image.
  • According to the above image processing method and apparatus, electronic device, and computer-readable storage medium, a preview image is obtained and divided into at least two sub-areas; the phase difference corresponding to each of the at least two sub-areas is obtained; at least two target phase differences, including a target foreground phase difference and a target background phase difference, are determined from the phase differences corresponding to the sub-areas; and focusing is performed according to each target phase difference to obtain an image corresponding to each target phase difference.
  • In this way, at least two images focused at different positions are acquired, one being a background in-focus image and another being a foreground in-focus image; the images corresponding to each target phase difference are then synthesized to obtain a fully in-focus image, which yields an image with a smaller out-of-focus area and improves the sharpness of the image.
  • Fig. 1 is an application environment diagram of an image processing method in an embodiment.
  • Fig. 2 is a flowchart of an image processing method in an embodiment.
  • Fig. 3 is a schematic diagram of the principle of phase focusing in an embodiment.
  • FIG. 4 is a schematic diagram of phase detection pixel points arranged in pairs in the pixel points included in the image sensor in an embodiment.
  • Fig. 5 is a schematic diagram of a part of the structure of an electronic device in an embodiment.
  • FIG. 6 is a schematic structural diagram of a part of the image sensor 504 in an embodiment.
  • FIG. 7 is a schematic diagram of the structure of pixels in an embodiment.
  • FIG. 8 is a schematic diagram of the internal structure of an image sensor in an embodiment.
  • FIG. 9 is a schematic diagram of the pixel point group Z in an embodiment.
  • FIG. 10 is a schematic diagram of a process of obtaining the phase difference corresponding to each sub-region in an embodiment.
  • FIG. 11 is a schematic diagram of performing segmentation processing on the target brightness map in the first direction in an embodiment.
  • FIG. 12 is a schematic diagram of performing segmentation processing on the target brightness map in the second direction in an embodiment.
  • Fig. 13 is a schematic flow chart of synthesizing to obtain a full in-focus image in an embodiment.
  • Fig. 14 is a schematic flow chart of synthesizing to obtain a full in-focus image in another embodiment.
  • Fig. 15 is a schematic flow chart of synthesizing a full in-focus image in another embodiment.
  • Fig. 16 is a structural block diagram of an image processing apparatus according to an embodiment.
  • Fig. 17 is a schematic diagram of the internal structure of an electronic device in an embodiment.
  • The terms "first" and "second" used in this application can be used herein to describe various elements, but these elements are not limited by these terms; the terms are only used to distinguish one element from another.
  • For example, without departing from the scope of this application, the first average value of phase difference may be referred to as the second average value of phase difference, and similarly, the second average value of phase difference may be referred to as the first average value of phase difference; both are average values of phase difference, but they are not the same average value of phase difference.
  • Likewise, the first image feature can be referred to as the second image feature, and the second image feature can be referred to as the first image feature; both are image features, but they are not the same image feature.
  • the embodiment of the present application provides an electronic device.
  • the electronic device can be any terminal device including a mobile phone, tablet computer, PDA (Personal Digital Assistant), POS (Point of Sales) terminal, on-board computer, wearable device, etc. The following description takes the electronic device being a mobile phone as an example.
  • the above-mentioned electronic equipment includes an image processing circuit, which can be implemented by hardware and/or software components, and can include various processing units that define an ISP (Image Signal Processing, image signal processing) pipeline.
  • Fig. 1 is a schematic diagram of an image processing circuit in an embodiment. As shown in FIG. 1, for ease of description, only various aspects of the image processing technology related to the embodiments of the present application are shown.
  • the image processing circuit includes an ISP processor 140 and a control logic 150.
  • the image data captured by the imaging device 110 is first processed by the ISP processor 140, which analyzes the image data to capture image statistical information that can be used to determine one or more control parameters of the ISP processor 140 and/or the imaging device 110.
  • the imaging device 110 may include a camera having one or more lenses 112 and an image sensor 114.
  • the image sensor 114 may include a color filter array (such as a Bayer filter). The image sensor 114 may obtain the light intensity and wavelength information captured by each imaging pixel of the image sensor 114 and provide a set of raw image data that can be processed by the ISP processor 140.
  • the attitude sensor 120 (such as a three-axis gyroscope, a Hall sensor, and an accelerometer) can provide the collected image processing parameters (such as anti-shake parameters) to the ISP processor 140 based on the interface type of the attitude sensor 120.
  • the interface of the attitude sensor 120 may use an SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the foregoing interfaces.
  • the image sensor 114 may also send the raw image data to the attitude sensor 120, and the attitude sensor 120 can provide the raw image data to the ISP processor 140 based on the interface type of the attitude sensor 120, or store the raw image data in the image memory 130.
  • the ISP processor 140 processes the original image data pixel by pixel in a variety of formats.
  • each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 140 may perform one or more image processing operations on the original image data, and collect statistical information about the image data. Among them, the image processing operations can be performed with the same or different bit depth accuracy.
  • the ISP processor 140 may also receive image data from the image memory 130.
  • the attitude sensor 120 interface sends the raw image data to the image memory 130, and the raw image data in the image memory 130 is then provided to the ISP processor 140 for processing.
  • the image memory 130 may be a part of a memory device, a storage device, or an independent dedicated memory in an electronic device, and may include DMA (Direct Memory Access) features.
  • the ISP processor 140 may perform one or more image processing operations, such as temporal filtering.
  • the processed image data can be sent to the image memory 130 for additional processing before being displayed.
  • the ISP processor 140 receives the processed data from the image memory 130, and performs image data processing in the original domain and in the RGB and YCbCr color spaces on the processed data.
  • the image data processed by the ISP processor 140 may be output to the display 160 for viewing by the user and/or further processed by a graphics engine or a GPU (Graphics Processing Unit, graphics processor).
  • the output of the ISP processor 140 can also be sent to the image memory 130, and the display 160 can read image data from the image memory 130.
  • the image memory 130 may be configured to implement one or more frame buffers.
  • the statistical data determined by the ISP processor 140 may be sent to the control logic 150 unit.
  • the statistical data may include image sensor 114 statistical information such as the vibration frequency of the gyroscope, automatic exposure, automatic white balance, automatic focus, flicker detection, black level compensation, and lens 112 shadow correction.
  • the control logic 150 may include a processor and/or a microcontroller that executes one or more routines (such as firmware); the one or more routines can determine, based on the received statistical data, the control parameters of the imaging device 110 and the control parameters of the ISP processor 140.
  • control parameters of the imaging device 110 may include attitude sensor 120 control parameters (such as gain, integration time of exposure control, anti-shake parameters, etc.), camera flash control parameters, camera anti-shake displacement parameters, lens 112 control parameters (such as focus or Zoom focal length), or a combination of these parameters.
  • the ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (for example, during RGB processing), and lens 112 shading correction parameters.
  • the image sensor 114 in the imaging device may include a plurality of pixel point groups arranged in an array, wherein each pixel point group includes a plurality of pixel points arranged in an array, and each pixel point includes Multiple sub-pixels arranged in an array.
  • the first image is acquired through the lens 112 and the image sensor 114 in the imaging device (camera) 110, and the first image is sent to the ISP processor 140.
  • the ISP processor 140 can perform subject detection on the first image to obtain the region of interest in the first image, or obtain the region selected by the user as the region of interest, or obtain the region of interest in other ways; the manner of obtaining the region of interest is not limited to this.
  • the ISP processor 140 is configured to obtain a preview image, divide the preview image into at least two sub-areas, obtain a phase difference corresponding to each of the at least two sub-areas, and determine at least two target phase differences according to the phase difference corresponding to each sub-area, where the at least two target phase differences include the target foreground phase difference and the target background phase difference; focusing is performed according to each target phase difference to obtain an image corresponding to each target phase difference, and the images corresponding to each target phase difference are synthesized to obtain a fully in-focus image.
  • the ISP processor 140 may send relevant information of the target sub-region, such as location information, contour information, etc., to the control logic 150.
  • control logic 150 controls the lens 112 in the imaging device (camera) to move, so as to focus on the position in the actual scene corresponding to the target area.
  • Fig. 2 is a flowchart of an image processing method in an embodiment. As shown in FIG. 2, an image processing method applied to an electronic device includes operation 202 to operation 212.
  • Operation 202: a preview image is obtained.
  • the number of cameras of the electronic device is not limited. For example, it may be one or two... and it is not limited to this.
  • the form of the camera installed in the electronic device is not limited. For example, it can be a camera built into the electronic device, or it can be an external camera of the electronic device. It can be a front camera or a rear camera.
  • the camera on the electronic device can be any type of camera.
  • the camera may be a color camera, a black-and-white camera, a depth camera, a telephoto camera, a wide-angle camera, etc., but is not limited thereto.
  • the preview image may be a visible light image.
  • the preview image refers to the image presented on the screen of the electronic device when the camera is not shooting.
  • the preview image can be the preview image of the current frame.
  • the electronic device obtains a preview image through a camera and displays it on the display screen.
  • Operation 204: the preview image is divided into at least two sub-areas.
  • the sub-region refers to an image region in the preview image.
  • the sub-region is a part of the image. That is, the sub-region includes a part of the pixels of the preview image.
  • the size and shape of each sub-region obtained by dividing the preview image may both be the same or both be different, or one of the size and shape may be the same while the other is different.
  • the specific division method is not limited.
  • the electronic device divides the preview image into at least two sub-areas.
  • the electronic device may divide the preview image into M×N sub-areas, where both M and N are positive integers and the values of M and N may be the same or different.
  • For example, if the preview image is 100×100 pixels and is divided into 4 sub-regions arranged in 2 rows and 2 columns, each sub-region is 50×50 pixels. A minimal sketch of such a division is given below.
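  • As a non-limiting illustration of the division in operation 204, the following Python/NumPy sketch splits a preview image into an M×N grid of sub-areas. The function name and the use of NumPy are assumptions for illustration only; the patent does not prescribe a particular implementation.

```python
import numpy as np

def divide_into_subareas(preview, m, n):
    """Split a preview image (H x W or H x W x C array) into an m x n grid.

    Returns a list of (row_index, col_index, sub_image) tuples. Edge sub-areas
    absorb any remainder pixels, so sizes may differ slightly when H or W is
    not divisible by m or n (the patent allows equal or unequal sub-areas).
    """
    h, w = preview.shape[:2]
    row_edges = np.linspace(0, h, m + 1, dtype=int)
    col_edges = np.linspace(0, w, n + 1, dtype=int)
    subareas = []
    for i in range(m):
        for j in range(n):
            sub = preview[row_edges[i]:row_edges[i + 1],
                          col_edges[j]:col_edges[j + 1]]
            subareas.append((i, j, sub))
    return subareas

# Example: a 100 x 100 preview image divided into a 2 x 2 grid of 4 sub-areas,
# each 50 x 50 pixels.
preview = np.zeros((100, 100), dtype=np.uint8)
parts = divide_into_subareas(preview, 2, 2)
assert parts[0][2].shape == (50, 50)
```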
  • Operation 206: a phase difference corresponding to each of the at least two sub-regions is obtained.
  • the phase difference refers to the difference in the positions, on the image sensor, of the images formed by imaging light entering the lens from different directions.
  • the electronic device includes an image sensor
  • the image sensor may include a plurality of pixel point groups arranged in an array, each pixel point group includes M*N pixel points arranged in an array; each pixel point corresponds to a photosensitive unit, Among them, both M and N are natural numbers greater than or equal to 2.
  • the phase difference corresponding to each sub-region may include the first phase difference and the second phase difference.
  • the first direction corresponding to the first phase difference and the second direction corresponding to the second phase difference form a preset angle.
  • the preset included angle may be any angle other than 0 degrees, 180 degrees, and 360 degrees; that is, each sub-region may correspond to two phase differences.
  • In this case, the electronic device acquires the credibility of the first phase difference and the credibility of the second phase difference, compares the two credibilities, and uses the phase difference with the higher credibility as the phase difference corresponding to the sub-region.
  • In order to perform phase detection auto focus, some phase detection pixels, which can also be called shielded pixels, are usually arranged in pairs among the pixels included in the image sensor; in each phase detection pixel pair, one phase detection pixel is shielded on its left side and the other is shielded on its right side.
  • In this way, the imaging beam directed at each phase detection pixel pair is separated into a left part and a right part, and the phase difference corresponding to each sub-region can be obtained by comparing the images formed by the left and right parts of the imaging beam.
  • Operation 208: at least two target phase differences are determined from the phase differences corresponding to each sub-region, and the at least two target phase differences include the target foreground phase difference and the target background phase difference.
  • the foreground refers to the part with smaller depth in the image.
  • the foreground contains the subject.
  • the foreground is generally the object that the user wants to focus on.
  • the at least two target phase differences include the foreground phase difference and the background phase difference, and may also include other phase differences. For example, the phase difference between the foreground and the background.
  • the electronic device determines at least two target phase differences from the phase differences corresponding to each area, and the at least two target phase differences include at least the target foreground phase difference and the target background phase difference.
  • Operation 210: focusing is performed according to each target phase difference, and an image corresponding to each target phase difference is obtained.
  • focusing refers to the process of adjusting the object distance and image distance through the focusing mechanism of the electronic device so that the image of the photographed object is clear.
  • Focus can refer to auto focus.
  • Auto focus may refer to Phase Detection Auto Focus (PDAF) and other auto focus methods combined with phase focus.
  • Phase focusing is to obtain the phase difference through the sensor, calculate the defocus value according to the phase difference, and control the lens to move the corresponding distance according to the defocus value to achieve focus.
  • Phase focus can be combined with other focus methods, such as continuous auto focus, laser focus, etc.
  • the electronic device when receiving a photographing instruction, performs focusing according to each target phase difference of the at least two target phase differences, and obtains an image corresponding to each target phase difference.
  • the electronic device can calculate the defocus value corresponding to each target phase difference according to each of the at least two target phase differences, and control the lens to move the corresponding distance according to each defocus value to obtain the image corresponding to each target phase difference.
  • The defocus value refers to the distance between the current position of the image sensor and the position where the image sensor should be in the in-focus state.
  • Each phase difference has a corresponding defocus value.
  • the defocus value corresponding to each phase difference can be the same or different.
  • the relationship between the phase difference and the defocus value can be obtained by pre-calibration.
  • the relationship between the phase difference and the defocus value may be a linear relationship or a nonlinear relationship.
  • the at least two target phase differences include target phase difference A, target phase difference B, and target phase difference C.
  • the target phase difference A is the target foreground phase difference
  • the target phase difference C is the target background phase difference
  • the target phase difference B is the phase difference between the target phase difference A and the target phase difference C.
  • the defocus value A is calculated according to the target phase difference A, and the lens is moved by the corresponding distance according to the defocus value A to obtain the image A corresponding to the target phase difference A.
  • the defocus value B is calculated according to the target phase difference B, and the lens is controlled to move the corresponding distance according to the defocus value B to obtain the image B corresponding to the target phase difference B.
  • the defocus value C is calculated according to the target phase difference C, and the lens is moved by the corresponding distance according to the defocus value C to obtain the image C corresponding to the target phase difference C. Then the electronic device gets image A, image B, and image C.
  • the processing order of the target phase difference is not limited.
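  • The per-target focusing loop described above can be sketched as follows. This is a hedged illustration only: the linear calibration (slope and offset values are made up) and the helper names move_lens and capture are assumptions, since the patent only states that the phase-difference-to-defocus relationship is pre-calibrated and may be linear or nonlinear.

```python
def defocus_from_phase_difference(pd, slope=12.5, offset=0.0):
    """Convert a phase difference to a defocus value (e.g. lens motor steps).

    Assumes a linear pre-calibration; a nonlinear lookup table would also fit
    the patent's description.
    """
    return slope * pd + offset

def capture_images_for_targets(target_pds, move_lens, capture):
    """For each target phase difference, move the lens and capture one image.

    move_lens(defocus) and capture() are hypothetical stand-ins for the
    camera driver calls of the electronic device.
    """
    images = []
    for pd in target_pds:
        defocus = defocus_from_phase_difference(pd)
        move_lens(defocus)          # move the lens by the corresponding distance
        images.append(capture())    # image corresponding to this target phase difference
    return images
```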
  • Operation 212: the image corresponding to each target phase difference is synthesized to obtain a fully in-focus image.
  • the fully in-focus image refers to an image in which there is no out-of-focus area theoretically.
  • Image stitching refers to combining several images, which can be obtained by focusing at different positions, or images corresponding to different phase differences, to form a seamless panoramic image or high-resolution image.
  • the electronic device may stitch and synthesize the clear parts of the image corresponding to each target phase difference to obtain a fully in-focus image.
  • the electronic device may use the Laplacian pyramid method to fuse the images corresponding to each target phase difference to obtain a fully in-focus image.
  • Alternatively, the electronic device may input the images corresponding to each target phase difference into a convolutional neural network model for synthesis to obtain a fully in-focus image; the synthesis manner is not limited to these. A simplified sharpness-based fusion sketch is given below.
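  • As a simplified illustration of synthesizing the differently focused images, the sketch below fuses grayscale images by picking, per pixel, the source image with the largest local sharpness (absolute Laplacian response). This is a much simpler stand-in for the Laplacian pyramid or neural-network fusion mentioned above, not the patent's prescribed method; all function names are illustrative.

```python
import numpy as np

LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]], dtype=np.float64)

def _conv2d_same(img, kernel):
    """Minimal 'same' 2D convolution for a single-channel float image."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="reflect")
    out = np.zeros_like(img)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def fuse_all_in_focus(images):
    """Fuse grayscale images focused at different positions into one image.

    For every pixel, the output value is taken from the input image whose
    local sharpness (|Laplacian|) is largest at that pixel.
    """
    stack = np.stack([img.astype(np.float64) for img in images], axis=0)
    sharpness = np.stack([np.abs(_conv2d_same(img, LAPLACIAN)) for img in stack], axis=0)
    best = np.argmax(sharpness, axis=0)          # index of sharpest image per pixel
    rows, cols = np.indices(best.shape)
    fused = stack[best, rows, cols]
    return fused.astype(images[0].dtype)
```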
  • In the above image processing method, a preview image is obtained, the preview image is divided into at least two sub-regions, the phase difference corresponding to each of the at least two sub-regions is obtained, and at least two target phase differences, including the target foreground phase difference and the target background phase difference, are determined from the phase differences corresponding to the sub-regions.
  • Focusing is performed according to each target phase difference to obtain an image corresponding to each target phase difference, so that at least two images focused at different positions can be acquired, one being the background in-focus image and another being the foreground in-focus image.
  • The images corresponding to each target phase difference are then synthesized to obtain a fully in-focus image, which yields an image with a smaller out-of-focus area and improves the clarity of the image.
  • FIG. 3 is a schematic diagram of the principle of phase focusing in an embodiment.
  • M1 is the position of the image sensor when the electronic device is in the in-focus state.
  • the in-focus state refers to the state of successful focusing.
  • As shown in Figure 3, when the image sensor is at the M1 position, the imaging light g reflected by the object W toward the lens Lens in different directions converges on the image sensor; that is, the imaging light g reflected by the object W toward the lens Lens in different directions is imaged at the same position on the image sensor, and at this time the image on the image sensor is clear.
  • M2 and M3 are the possible positions of the image sensor when the electronic device is not in focus.
  • When the image sensor is at the M2 position or the M3 position, the imaging light g reflected by the object W toward the lens Lens in different directions is imaged at different positions.
  • When the image sensor is at the M2 position, the imaging light g reflected by the object W toward the lens Lens in different directions is imaged at position A and position B respectively; when the image sensor is at the M3 position, the imaging light g is imaged at position C and position D respectively. At this time, the image on the image sensor is not clear.
  • In phase detection auto focus, the difference in the positions, on the image sensor, of the images formed by imaging light entering the lens from different directions can be obtained; for example, the difference between position A and position B, or the difference between position C and position D, can be obtained.
  • After obtaining this positional difference, the defocus value can be obtained according to the positional difference and the geometric relationship between the lens and the image sensor in the camera.
  • The so-called defocus value refers to the distance between the current position of the image sensor and the position where the image sensor should be in the in-focus state; the electronic device can focus according to the obtained defocus value.
  • It can be seen that the "difference in the positions of the images formed on the image sensor by imaging light entering the lens from different directions" can generally be referred to as the phase difference.
  • obtaining the phase difference is a very critical technical link.
  • The phase difference can be applied to a variety of different scenes, and the focusing scene is only one possible scene among them.
  • For example, the phase difference can be applied to depth map acquisition, that is, the phase difference can be used to acquire a depth map; for another example, the phase difference can be used in three-dimensional image reconstruction, that is, the phase difference can be used to reconstruct a three-dimensional image.
  • the embodiment of the present application aims to provide a method for obtaining the phase difference. As for the scene to which the phase difference is applied after the phase difference is obtained, the embodiment of the present application does not specifically limit it.
  • FIG. 4 is a schematic diagram of phase detection pixel points arranged in pairs in the pixel points included in the image sensor in an embodiment.
  • a phase detection pixel point pair (hereinafter referred to as a pixel point pair) A, a pixel point pair B, and a pixel point pair C may be provided in the image sensor.
  • Among them, in each pixel point pair, one phase detection pixel is shielded on the left (Left Shield) and the other phase detection pixel is shielded on the right (Right Shield).
  • the imaging beam can be divided into left and right parts, and the phase difference can be obtained by comparing the images formed by the left and right parts of the imaging beam.
  • the focusing method based on the structure in Figure 4 obtains the phase difference through the sensor, calculates the defocus value according to the phase difference, controls the lens to move according to the defocus value, and then searches for the focus value (FV) peak.
  • In one embodiment, determining the at least two target phase differences includes operation (a1): the phase difference corresponding to each of the at least two sub-regions is divided into a foreground phase difference set and a background phase difference set.
  • the foreground phase difference set includes at least one foreground phase difference.
  • the background phase difference set includes at least one background phase difference.
  • the phase difference threshold may be stored in the electronic device.
  • the phase difference greater than the phase difference threshold is divided into a background phase difference set, and the phase difference less than or equal to the phase difference threshold is divided into a foreground phase difference set.
  • the electronic device calculates the median phase difference according to the phase difference corresponding to each sub-region.
  • the phase difference greater than the median of the phase difference is divided into the background phase difference set, and the phase difference less than or equal to the median of the phase difference is divided into the foreground phase difference set.
  • Operation (a2) is to obtain the first mean value of the phase difference corresponding to the foreground phase difference set.
  • the electronic device obtains an average value according to the phase difference in the foreground phase difference set to obtain the first average value of the phase difference.
  • Operation (a3) is to obtain the second mean value of phase difference corresponding to the background phase difference set.
  • the electronic device obtains the average value according to the phase difference in the background phase difference set to obtain the second average value of the phase difference.
  • the first average value of the phase difference is used as the target foreground phase difference.
  • the electronic device uses the first average value of phase difference as the target foreground phase difference.
  • the corresponding first defocus value is calculated according to the first average value of phase difference, and the lens is controlled to move the corresponding distance according to the first defocus value to obtain The image corresponding to the first mean value of the phase difference.
  • the area corresponding to the first average value of phase difference is used as the focus area for focusing, and an image corresponding to the first average value of phase difference is obtained.
  • the second average phase difference is used as the target background phase difference.
  • the electronic device uses the second average value of the phase difference as the target background phase difference.
  • the corresponding second defocus value is calculated according to the second average value of phase difference, and the lens is controlled to move the corresponding distance according to the second defocus value to obtain The image corresponding to the second mean value of the phase difference.
  • the area corresponding to the second average value of phase difference is used as the focus area for focusing to obtain an image corresponding to the second average value of phase difference.
  • In the image processing method of this embodiment, the phase difference corresponding to each of the at least two sub-regions is divided into a foreground phase difference set and a background phase difference set; the first average value of phase difference corresponding to the foreground phase difference set and the second average value of phase difference corresponding to the background phase difference set are obtained; the first average value of phase difference is taken as the target foreground phase difference, and the second average value of phase difference is taken as the target background phase difference.
  • In this way, the foreground in-focus image and the background in-focus image can be obtained by focusing according to the average values, which improves the clarity of the image. A minimal sketch of this split is given below.
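  • A minimal sketch of the set split and mean computation described above, assuming the median of the per-sub-region phase differences is used as the split point (the patent also allows a pre-stored threshold); the function name and the degenerate-case handling are illustrative assumptions.

```python
import numpy as np

def target_phase_differences(sub_region_pds):
    """Split per-sub-region phase differences into foreground/background sets
    and return (target_foreground_pd, target_background_pd) as the set means.
    """
    pds = np.asarray(sub_region_pds, dtype=np.float64)
    split = np.median(pds)              # or a pre-stored phase difference threshold
    foreground = pds[pds <= split]      # smaller phase difference -> nearer scene
    background = pds[pds > split]
    if background.size == 0:            # degenerate case: all values on one side
        background = foreground
    return foreground.mean(), background.mean()

# Example: three near sub-regions and three far sub-regions
fg_pd, bg_pd = target_phase_differences([0.8, 1.1, 0.9, 4.2, 3.9, 4.5])
```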
  • In one embodiment, the image processing method further includes: excluding the maximum phase difference among the phase differences corresponding to the sub-regions to obtain a remaining phase difference set; and dividing the phase difference corresponding to each of the at least two sub-regions into the foreground phase difference set and the background phase difference set includes: dividing the remaining phase difference set into a foreground phase difference set and a background phase difference set.
  • the area corresponding to the largest phase difference among the phase differences corresponding to the sub-regions is the area corresponding to the farthest scene in the preview image. Excluding the maximum phase difference among the phase differences corresponding to the sub-regions is to exclude the phase difference corresponding to the farthest scene in the preview image.
  • the phase difference threshold can be stored in the electronic device.
  • the phase difference that is greater than the phase difference threshold in the remaining phase difference set is divided into a background phase difference set, and the phase difference that is less than or equal to the phase difference threshold is divided into a foreground phase difference set.
  • the electronic device calculates the median phase difference according to the phase difference in the remaining phase difference set.
  • the phase difference greater than the median of the phase difference is divided into the background phase difference set, and the phase difference less than or equal to the median of the phase difference is divided into the foreground phase difference set.
  • the electronic device calculates the average value of the phase difference according to the phase difference in the remaining phase difference set.
  • the phase difference greater than the average phase difference is divided into a background phase difference set, and the phase difference less than or equal to the average phase difference is divided into a foreground phase difference set.
  • In the image processing method of this embodiment, the largest phase difference among the phase differences corresponding to the sub-regions is excluded to obtain the remaining phase difference set, which can eliminate the farthest background; the remaining phase difference set is then divided into a foreground phase difference set and a background phase difference set, and focusing based on the average values can improve the clarity of the image.
  • determining at least two target phase differences from the phase differences corresponding to each sub-region, and the at least two target phase differences include the foreground phase difference and the background phase difference, including: obtaining the phase difference of the at least two sub-regions The maximum phase difference and the minimum phase difference; the minimum phase difference is regarded as the foreground phase difference; the maximum phase difference is regarded as the background phase difference.
  • the area corresponding to the largest phase difference is the area corresponding to the most distant object.
  • the area corresponding to the smallest phase difference is the area corresponding to the nearest scene. There is usually a target subject in the area corresponding to the smallest phase difference.
  • the image processing method in this embodiment obtains the maximum phase difference and the minimum phase difference among the phase differences of the at least two sub-regions, and uses the minimum phase difference as the foreground phase difference and the maximum phase difference as the background phase difference, so that only two images need to be synthesized, which improves image processing efficiency while improving image clarity.
  • the electronic device includes an image sensor, the image sensor includes a plurality of pixel point groups arranged in an array, and each pixel point group includes M*N pixel points arranged in an array; each pixel point corresponds to a photosensitive unit , Where both M and N are natural numbers greater than or equal to 2.
  • FIG. 5 is a schematic diagram of a part of the structure of an electronic device in an embodiment.
  • the electronic device may include a lens 502 and an image sensor 504, where the lens 502 may be composed of a series of lenses, and the image sensor 504 may be a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD), a quantum thin-film sensor, an organic sensor, or the like.
  • the image sensor 504 may include a plurality of pixel point groups Z arranged in an array, where each pixel point group Z includes a plurality of pixel points D arranged in an array, and each pixel point includes a plurality of sub-pixel points d arranged in an array.
  • For example, each pixel point group Z may include 4 pixel points D arranged in an array of two rows and two columns, and each pixel point may include 4 sub-pixel points d arranged in an array of two rows and two columns.
  • A pixel point included in the image sensor 504 refers to a photosensitive unit, which may be composed of a plurality of photosensitive elements (that is, sub-pixel points) arranged in an array, where a photosensitive element is an element that converts an optical signal into an electrical signal.
  • The photosensitive unit may further include a micro lens, a filter, and the like, where the micro lens is disposed on the filter, the filter is disposed on each photosensitive element included in the photosensitive unit, and the filter may be one of red, green, and blue, which respectively transmit only light of the corresponding red, green, or blue wavelengths.
  • FIG. 7 is a schematic diagram of the structure of a pixel point in an embodiment. As shown in Figure 7, taking a pixel point that includes sub-pixel point 1, sub-pixel point 2, sub-pixel point 3, and sub-pixel point 4 as an example: sub-pixel point 1 and sub-pixel point 2 can be combined, and sub-pixel point 3 and sub-pixel point 4 can be combined, to form a PD pixel pair in the up-and-down direction, from which the phase difference in the vertical direction is obtained and horizontal edges can be detected.
  • Similarly, sub-pixel point 1 and sub-pixel point 3 can be combined, and sub-pixel point 2 and sub-pixel point 4 can be combined, to form a PD pixel pair in the left-and-right direction, from which the phase difference in the horizontal direction is obtained and vertical edges can be detected. A sketch of this combination is given below.
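  • To illustrate how the 2×2 sub-pixel points of one pixel point can be combined into PD pixel pairs as described above, the following sketch sums sub-pixel brightness values. The array layout (sub-pixel 1 top-left, 2 top-right, 3 bottom-left, 4 bottom-right) and the function name are assumptions made for illustration.

```python
import numpy as np

def pd_pairs_from_subpixels(subpixels):
    """subpixels: 2x2 array of brightness values for one pixel point,
    assumed laid out as [[sub1, sub2], [sub3, sub4]].

    Returns ((up, down), (left, right)):
      - up/down pair (sub1+sub2 vs sub3+sub4) -> vertical phase difference,
        useful for detecting horizontal edges;
      - left/right pair (sub1+sub3 vs sub2+sub4) -> horizontal phase difference,
        useful for detecting vertical edges.
    """
    s = np.asarray(subpixels, dtype=np.float64)
    up, down = s[0].sum(), s[1].sum()           # combine the two rows
    left, right = s[:, 0].sum(), s[:, 1].sum()  # combine the two columns
    return (up, down), (left, right)
```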
  • FIG. 8 is a schematic diagram of the internal structure of an image sensor in an embodiment.
  • the imaging device includes a lens and an image sensor.
  • the image sensor includes a micro lens 80, a filter 82 and a photosensitive unit 84.
  • the micro lens 80, the filter 82 and the photosensitive unit 84 are sequentially located on the incident light path, that is, the micro lens 80 is disposed on the filter 82, and the filter 82 is disposed on the photosensitive unit 84.
  • the filter 82 may include three types of red, green, and blue, and can only transmit light of corresponding wavelengths of red, green, and blue, respectively.
  • One filter 82 is arranged on one pixel point.
  • the micro lens 80 is used to receive incident light and transmit the incident light to the filter 82; after the filter 82 filters the incident light, the filtered light is incident on the photosensitive unit 84 on a pixel basis.
  • the photosensitive unit 84 in the image sensor converts the light incident from the filter 82 into a charge signal through the photoelectric effect, and generates a pixel signal corresponding to the charge signal; the charge signal corresponds to the received light intensity.
  • the pixels included in the image sensor and the pixels included in the image are two different concepts.
  • the pixels included in the image refer to the smallest component unit of the image, which is generally represented by a sequence of numbers.
  • the sequence of numbers can be referred to as the pixel value of a pixel.
  • the embodiments of the present application involve both concepts of "pixels included in an image sensor" and "pixels included in an image”. To facilitate readers' understanding, a brief explanation is provided here.
  • FIG. 9 shows a schematic diagram of an exemplary pixel point group Z.
  • the pixel point group Z includes 4 pixel points D arranged in an array of two rows and two columns, where the color channel of the pixel point in the first row and first column is green, that is, its filter is a green filter; the color channel of the pixel point in the first row and second column is red, that is, its filter is a red filter; the color channel of the pixel point in the second row and first column is blue, that is, its filter is a blue filter; and the color channel of the pixel point in the second row and second column is green, that is, its filter is a green filter.
  • the electronic device includes an image sensor, and the image sensor includes a plurality of pixel point groups arranged in an array, and each pixel point group includes a plurality of pixel points arranged in an array.
  • As shown in FIG. 10, which is a schematic diagram of the process of obtaining the phase difference corresponding to each sub-region in an embodiment, obtaining the phase difference corresponding to each of the at least two sub-regions includes the following operations.
  • a target brightness map is obtained according to the brightness values of the pixel points included in each pixel point group.
  • the brightness value of the pixel of the image sensor can be characterized by the brightness value of the sub-pixel included in the pixel.
  • the electronic device can obtain the target brightness map according to the brightness values of the sub-pixels in the pixel points included in each pixel point group.
  • the "brightness value of a sub-pixel” refers to the brightness value of the light signal received by the sub-pixel.
  • the sub-pixel included in the image sensor is a photosensitive element that can convert light signals into electrical signals. Therefore, the electronic device can obtain the intensity of the light signal received by the sub-pixel according to the electrical signal output by the sub-pixel, and obtain the brightness value of the sub-pixel according to the intensity of the light signal received by the sub-pixel.
  • the target brightness map in the embodiment of the present application is used to reflect the brightness values of the sub-pixel points in the image sensor.
  • the target brightness map may include multiple pixels, where the pixel value of each pixel in the target brightness map is obtained based on the brightness values of the sub-pixel points in the image sensor.
  • segmentation processing is performed on the target brightness map, and the first segmented brightness map and the second segmented brightness map are obtained according to the results of the segmentation processing.
  • the electronic device may perform segmentation processing on the target brightness map along the direction of the column (the y-axis direction in the image coordinate system).
  • the first segmented brightness map and the second segmented brightness map obtained after the target brightness map is segmented along the column direction can be called the left image and the right image, respectively.
  • the electronic device may perform segmentation processing on the target brightness map along the row direction (the x-axis direction in the image coordinate system).
  • the first segmented brightness map and the second segmented brightness map obtained after the target brightness map is segmented in the direction of the row can be referred to as the upper image and the lower image, respectively.
  • the phase difference of the pixels that match each other is determined according to the position difference of the pixels that match each other in the first split brightness map and the second split brightness map.
  • When the target brightness map is segmented along the direction of the row, the obtained first segmented brightness map and second segmented brightness map are the upper image and the lower image; the electronic device then obtains the vertical phase difference according to the position difference of the matching pixels in the first segmented brightness map and the second segmented brightness map.
  • When the target brightness map is segmented along the direction of the column, the obtained first segmented brightness map and second segmented brightness map are the left image and the right image; the electronic device then obtains the horizontal phase difference according to the position difference of the matching pixels in the first segmented brightness map and the second segmented brightness map.
  • "Pixels that match each other" means that the pixel matrices composed of the pixels themselves and their surrounding pixels are similar to each other.
  • For example, the pixel a and its surrounding pixels in the first segmented brightness map form a pixel matrix with 3 rows and 3 columns, and the pixel b and its surrounding pixels in the second segmented brightness map also form a pixel matrix with 3 rows and 3 columns; if the two matrices are similar, it can be considered that the pixel a and the pixel b match each other.
  • A common method of judging whether two pixel matrices are similar is to calculate the difference of the pixel values of each pair of corresponding pixels in the two matrices, add up the absolute values of these differences, and use the sum to judge similarity: if the sum is less than a preset threshold, the two pixel matrices are considered similar; otherwise, they are considered not similar.
  • For example, the difference of 1 and 2, the difference of 10 and 10, the difference of 90 and 90, and so on are calculated, the absolute values of the differences are added, and the result of the addition is 3; if 3 is less than the preset threshold, the two pixel matrices with 3 rows and 3 columns are considered similar.
  • Another common method for judging whether pixel matrices are similar is to extract edge features using a Sobel convolution kernel calculation method or a Laplacian calculation method, and judge whether the pixel matrices are similar by their edge features.
  • the positional difference of pixels that match each other refers to the difference between the positions of the pixels in the first split brightness map and the positions of the pixels in the second split brightness map among the matched pixels.
  • the position difference between the pixel a and the pixel b that are matched with each other refers to the difference between the position of the pixel a in the first split brightness map and the position of the pixel b in the second split brightness map.
  • the pixels that match each other correspond to different images in the image sensor formed by the imaging light entering the lens from different directions.
  • For example, if the pixel a in the first segmented brightness map and the pixel b in the second segmented brightness map match each other, the pixel a may correspond to the image formed at position A in FIG. 3, and the pixel b may correspond to the image formed at position B in FIG. 3.
  • Since the pixels that match each other correspond to images formed by imaging light entering the lens from different directions, the phase difference of the matched pixels can be determined according to the position difference of the matched pixels. A minimal matching sketch is given below.
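  • A small sketch of the sum-of-absolute-differences (SAD) matching described above: for a pixel in the first segmented brightness map, the best match along the same row of the second segmented brightness map is found by comparing 3×3 neighborhoods. The row-wise search (suitable for left/right segmented maps), the search range, and the function names are illustrative assumptions.

```python
import numpy as np

def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized pixel matrices."""
    return np.abs(block_a.astype(np.float64) - block_b.astype(np.float64)).sum()

def match_pixel(first_map, second_map, row, col, search=8, half=1):
    """Find, in second_map, the column that best matches (row, col) of first_map.

    Compares (2*half+1)^2 neighborhoods with SAD over a horizontal search
    range. Returns (best_col, best_sad); the position difference of the
    matched pixels is then best_col - col. Assumes (row, col) is far enough
    from the borders for the neighborhood to fit.
    """
    ref = first_map[row - half:row + half + 1, col - half:col + half + 1]
    best_col, best_sad = col, np.inf
    lo = max(half, col - search)
    hi = min(second_map.shape[1] - half - 1, col + search)
    for c in range(lo, hi + 1):
        cand = second_map[row - half:row + half + 1, c - half:c + half + 1]
        score = sad(ref, cand)
        if score < best_sad:
            best_col, best_sad = c, score
    return best_col, best_sad
```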
  • the phase difference corresponding to each of the at least two sub-regions is determined according to the phase difference of the pixels that are matched with each other.
  • the electronic device determines a phase difference corresponding to each of the at least two sub-regions according to the phase differences of the mutually matched pixels.
  • the electronic device can obtain two phase differences corresponding to each sub-region according to the phase differences of the pixels that are matched with each other, which are the vertical phase difference and the horizontal phase difference, respectively.
  • the electronic device can obtain the vertical phase difference confidence level and the horizontal phase difference confidence level corresponding to each sub-region, determine the phase difference with the highest confidence level, and use the phase difference as a phase difference corresponding to each sub-region.
  • the image processing method in this embodiment obtains a target brightness map according to the brightness values of the pixel points included in each pixel point group in the image sensor; after the target brightness map is obtained, segmentation processing is performed on the target brightness map, and the first segmented brightness map and the second segmented brightness map are obtained according to the results of the segmentation processing; then, the phase difference of the mutually matched pixels is determined based on the position difference of the matching pixels in the first segmented brightness map and the second segmented brightness map; finally, the phase difference corresponding to each of the at least two sub-regions is determined according to the phase differences of the matched pixels. In this way, the brightness values of the pixel points included in each pixel point group in the image sensor can be used to determine the phase difference.
  • the phase difference of the matched pixels in the embodiment of the present application contains relatively rich phase difference information.
  • the accuracy of the acquired phase difference can be improved, so that when focusing, a high-precision phase difference corresponding to the focus area can be obtained without needing to search for the focus value peak, which improves the focusing efficiency and thus the overall efficiency of synthesizing the fully in-focus image.
  • In one embodiment, performing segmentation processing on the target brightness map and obtaining the first segmented brightness map and the second segmented brightness map according to the results of the segmentation processing includes the following operations.
  • Operation (b1): perform segmentation processing on the target brightness map to obtain multiple brightness map regions, where each brightness map region includes a row of pixels in the target brightness map, or each brightness map region includes a column of pixels in the target brightness map.
  • each luminance map area includes a row of pixels in the target luminance map.
  • each luminance map area includes a column of pixels in the target luminance map.
  • the electronic device may segment the target brightness map column by column along the row direction to obtain multiple pixel columns of the target brightness map.
  • the electronic device may segment the target brightness map row by row along the column direction to obtain multiple pixel rows of the target brightness map.
  • Operation (b2) obtain a plurality of first brightness map regions and a plurality of second brightness map regions from a plurality of brightness map regions, where the first brightness map region includes pixels in even rows of the target brightness map, or the first brightness map The area includes pixels in even-numbered columns in the target luminance map, and the second luminance map area includes pixels in odd-numbered rows in the target luminance map, or the second luminance map area includes pixels in odd-numbered columns in the target luminance map.
  • the first luminance map area includes pixels in even rows of the target luminance map.
  • the first luminance map area includes pixels in even-numbered columns in the target luminance map.
  • the second luminance map area includes pixels in odd rows in the target luminance map. Or, the second luminance map area includes pixels in odd-numbered columns in the target luminance map.
  • the electronic device may determine the even-numbered columns as the first brightness map area, and the odd-numbered columns as the second brightness map area.
  • the electronic device may determine even-numbered rows as the first brightness map area, and odd-numbered rows as the second brightness map area.
  • Operation (b3) is to use a plurality of first brightness map regions to form a first segmented brightness map, and use a plurality of second brightness map regions to form a second segmented brightness map.
  • FIG. 11 is a schematic diagram of performing segmentation processing on the target brightness map in the first direction in an embodiment.
  • FIG. 12 is a schematic diagram of performing segmentation processing on the target brightness map in the second direction in an embodiment.
  • the target brightness map includes 6 rows and 6 columns of pixels, when the target brightness map is segmented column by column, that is, the target brightness map is segmented in the first direction.
  • the electronic device can determine the pixels in the first column, the pixels in the third column, and the pixels in the fifth column of the target brightness map as the first brightness map area, and can set the pixels in the second column, the fourth column and the sixth column of the target brightness map. Determined as the second brightness map area.
  • the electronic device may splice the first brightness map area to obtain a first split brightness map T1, which includes the first column, the third column, and the fifth column of the target brightness map.
  • the electronic device may splice the second brightness map regions to obtain a second segmented brightness map T2, which includes the second column of pixels, the fourth column of pixels, and the sixth column of pixels of the target brightness map.
  • Similarly, when the target brightness map including 6 rows and 6 columns of pixels is segmented row by row, that is, segmented in the second direction, the electronic device can determine the pixels in the first row, the third row, and the fifth row of the target brightness map as the first brightness map areas, and the pixels in the second row, the fourth row, and the sixth row as the second brightness map areas.
  • Then, the electronic device can splice the first brightness map areas to obtain the first segmented brightness map T3, which includes the first row of pixels, the third row of pixels, and the fifth row of pixels of the target brightness map.
  • the electronic device may splice the second brightness map regions to obtain a second segmented brightness map T4, which includes the second row of pixels, the fourth row of pixels, and the sixth row of pixels of the target brightness map.
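  • As a minimal sketch of this segmentation (Python with NumPy; the array layout and the column-wise/row-wise switch are illustrative assumptions rather than the patent's implementation), the target brightness map can be split into the first and second segmented brightness maps as follows:

```python
import numpy as np

def split_brightness_map(target, direction="column"):
    """Split a target brightness map into first and second segmented brightness maps.

    direction="column": the 1st, 3rd, 5th, ... columns form the first segmented
    map and the 2nd, 4th, 6th, ... columns form the second (the T1/T2 example);
    direction="row" does the same row by row (the T3/T4 example).
    """
    if direction == "column":
        first = target[:, 0::2]   # columns 1, 3, 5, ... (1-based numbering)
        second = target[:, 1::2]  # columns 2, 4, 6, ...
    else:
        first = target[0::2, :]   # rows 1, 3, 5, ...
        second = target[1::2, :]  # rows 2, 4, 6, ...
    return first, second

# 6 x 6 target brightness map, as in the example above.
target = np.arange(36, dtype=np.float32).reshape(6, 6)
t1, t2 = split_brightness_map(target, direction="column")  # T1, T2
t3, t4 = split_brightness_map(target, direction="row")     # T3, T4
```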
  • the image processing method in the embodiment of the present application does not need to block pixels to obtain the phase difference, and obtains relatively rich phase difference information by means of brightness segmentation, which improves the accuracy of the obtained phase difference.
  • the electronic device includes an image sensor, the image sensor includes a plurality of pixel point groups arranged in an array, each of the pixel point groups includes M*N pixel points arranged in an array; each pixel point corresponds to one In the photosensitive unit, both M and N are natural numbers greater than or equal to 2.
  • The phase difference corresponding to each sub-region includes a horizontal phase difference and a vertical phase difference. Obtaining the phase difference corresponding to each of the at least two sub-regions includes: when it is detected that the sub-region contains horizontal lines, the vertical phase difference is regarded as the phase difference corresponding to the sub-region; when the sub-region does not contain horizontal lines, the horizontal phase difference is regarded as the phase difference corresponding to the sub-region.
  • If the phase difference direction is not selected appropriately, lines in the sub-region may have problems such as smearing. When it is detected that the sub-region contains horizontal lines, the vertical phase difference is used as the phase difference corresponding to the sub-region; when it is detected that the sub-region contains vertical lines, the horizontal phase difference is used as the phase difference corresponding to the sub-region. This can improve the accuracy of the obtained phase difference and thereby improve image clarity.
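  • As a rough sketch of this selection (Python/NumPy; the gradient-based line detector and its threshold are illustrative assumptions, since the patent does not specify how lines are detected):

```python
import numpy as np

def contains_horizontal_lines(region, ratio_thresh=1.5):
    """Heuristic: horizontal lines make the brightness change mostly from row to
    row, so the mean vertical gradient dominates the mean horizontal gradient."""
    r = region.astype(np.float32)
    gy = np.abs(np.diff(r, axis=0)).mean()  # gradient across rows
    gx = np.abs(np.diff(r, axis=1)).mean()  # gradient across columns
    return gy > ratio_thresh * gx

def select_phase_difference(region, pd_horizontal, pd_vertical):
    """Use the vertical phase difference for sub-regions dominated by horizontal
    lines; otherwise use the horizontal phase difference."""
    return pd_vertical if contains_horizontal_lines(region) else pd_horizontal
```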
  • Focusing according to each target phase difference to obtain an image corresponding to each target phase difference includes: taking the sub-area corresponding to each target phase difference as the focus area to obtain the image corresponding to each target phase difference.
  • the electronic device uses a sub-area corresponding to each target phase difference of the at least two target phase differences as a focus area, and obtains an image corresponding to each target phase difference.
  • at least two target phase differences include target phase difference A, target phase difference B, and target phase difference C.
  • the sub-area corresponding to the target phase difference A is used as the focus area, and focus is performed to obtain an image corresponding to the target phase difference A.
  • the sub-area corresponding to the target phase difference B is used as the focus area, and focus is performed to obtain an image corresponding to the target phase difference B.
  • the sub-region corresponding to the target phase difference C is used as the focus area, and focus is performed to obtain an image corresponding to the target phase difference C. That is, a total of three images are obtained.
  • the image processing method in this embodiment uses the sub-region corresponding to each target phase difference as a focus area to obtain an image corresponding to each target phase difference, and can obtain images with different focal points for synthesis, thereby improving image clarity.
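  • A high-level control loop for this focusing step might look like the following sketch (Python; camera, set_focus_area, move_lens_for_phase_difference, and capture_frame are hypothetical helpers, since lens actuation and the phase-difference-to-displacement calibration are device specific and not defined here):

```python
def capture_images_for_targets(camera, target_phase_differences, focus_areas):
    """For each target phase difference, take its sub-area as the focus area,
    drive the lens accordingly, and capture one image (control-flow sketch only)."""
    images = {}
    for pd in target_phase_differences:
        camera.set_focus_area(focus_areas[pd])        # sub-area corresponding to this target PD
        camera.move_lens_for_phase_difference(pd)     # hypothetical PD-to-displacement calibration
        images[pd] = camera.capture_frame()           # image corresponding to this target PD
    return images
```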
  • Synthesizing the images corresponding to each target phase difference to obtain a fully in-focus image includes: dividing the image corresponding to each target phase difference into the same number of sub-image areas; obtaining the definition corresponding to each sub-image area; determining, according to the definition corresponding to each sub-image area, the sub-image area with the highest definition among the sub-image areas that match each other; and splicing and combining the sub-image areas with the highest definition to obtain a fully in-focus image.
  • the sub-image areas that match each other refer to sub-image areas located at the same position in different images.
  • the electronic device divides the image corresponding to each target phase difference into the same number of sub-image areas.
  • the electronic device obtains the definition corresponding to each sub-image area in the image corresponding to each target phase difference.
  • the electronic device determines the sub-image area with the highest definition among the matched sub-image areas according to the corresponding definition of each sub-image area of each sub-image.
  • the electronic device synthesizes all the sub-image areas with the highest definition to obtain a fully in-focus image.
  • the target phase difference A corresponds to image A, and image A is divided into sub-image area 1, sub-image area 2, sub-image area 3, and sub-image area 4.
  • The target phase difference B corresponds to image B, and image B is divided into sub-image area a, sub-image area b, sub-image area c, and sub-image area d.
  • Sub-image area 1 is located at the upper left corner of image A, and sub-image area a is located at the upper left corner of image B, so sub-image area 1 matches sub-image area a, and so on.
  • If sub-image area 1, sub-image area b, sub-image area c, and sub-image area 4 are the sub-image areas with the highest definition, the electronic device splices and synthesizes sub-image area 1, sub-image area b, sub-image area c, and sub-image area 4 to obtain a fully in-focus image.
  • In this way, the image corresponding to each target phase difference is divided into the same number of sub-image areas; the definition corresponding to each sub-image area is obtained; the sub-image area with the highest definition among the sub-image areas that match each other is determined according to the definition corresponding to each sub-image area; and the sub-image areas with the highest definition are synthesized to obtain a fully in-focus image. A fully in-focus image can thus be obtained quickly, improving image processing efficiency.
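  • A minimal sketch of this selection and stitching (Python/NumPy; the 4 × 4 grid and the variance-of-Laplacian sharpness score stand in for whatever measure of definition the device uses):

```python
import numpy as np

def laplacian_variance(tile):
    """Variance of a simple Laplacian response as a sharpness (definition) score."""
    t = tile.astype(np.float32)
    lap = (-4.0 * t[1:-1, 1:-1] + t[:-2, 1:-1] + t[2:, 1:-1]
           + t[1:-1, :-2] + t[1:-1, 2:])
    return float(lap.var())

def fuse_by_sharpest_tiles(images, grid=(4, 4)):
    """Split each image into the same grid of sub-image areas, keep the sharpest
    matching tile at every grid position, and stitch the kept tiles together."""
    h, w = images[0].shape[:2]
    gh, gw = grid
    out = np.zeros_like(images[0])
    ys = np.linspace(0, h, gh + 1, dtype=int)
    xs = np.linspace(0, w, gw + 1, dtype=int)
    for i in range(gh):
        for j in range(gw):
            y0, y1, x0, x1 = ys[i], ys[i + 1], xs[j], xs[j + 1]
            tiles = [img[y0:y1, x0:x1] for img in images]
            out[y0:y1, x0:x1] = max(tiles, key=laplacian_variance)
    return out
```

  • Hard tile boundaries can leave visible seams, which is one motivation for the pyramid-based fusion described below.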
  • FIG. 13 is a schematic flowchart of synthesizing a fully in-focus image in an embodiment. Synthesizing the images corresponding to each target phase difference to obtain a fully in-focus image includes:
  • Operation 1302: perform convolution and sampling processing on the image corresponding to each target phase difference, and when the preset iteration condition is met, obtain a Gaussian pyramid of the image corresponding to each target phase difference.
  • the Gaussian pyramid is a kind of image pyramid. Except for the bottom layer image, the other layer images in the pyramid are all obtained by convolving and sampling the previous layer image. Gaussian pyramids can be used to obtain low-frequency images.
  • the low-frequency image can refer to the contour image in the image.
  • the iteration condition may mean reaching a preset number of times or reaching a preset time, etc., and is not limited thereto.
  • the image corresponding to each target phase difference has a corresponding Gaussian pyramid. For example, the image A corresponding to the target phase difference corresponds to the Gaussian pyramid A, and the image B corresponding to the target phase difference corresponds to the Gaussian pyramid B.
  • Specifically, the electronic device uses a Gaussian kernel to convolve the image corresponding to each target phase difference and samples the convolved image to obtain each layer of the pyramid. That is, the image corresponding to each target phase difference (set as G0) is convolved and sampled to obtain the next, lower-frequency layer G1; G1 is then convolved and sampled to obtain G2, and so on, until the preset iteration condition is met, for example, after 5 iterations image G5 is obtained. A Gaussian pyramid containing multiple low-frequency images is thus obtained for the image corresponding to each target phase difference.
  • processing is performed according to each layer of the image in the Gaussian pyramid of the image corresponding to each target phase difference to obtain the Laplacian pyramid of the image corresponding to each target phase difference.
  • The Laplacian pyramid (LP) is defined on the basis of the Gaussian pyramid: each layer is obtained by subtracting from a Gaussian pyramid layer the up-sampled version of the layer above it.
  • the image corresponding to each target phase difference has a corresponding Laplacian pyramid.
  • the image A corresponding to the target phase difference corresponds to the Laplacian Pyramid A
  • the image B corresponding to the target phase difference corresponds to the Laplacian Pyramid B.
  • Each layer of the Laplacian pyramid represents a different scale and level of detail, where the details can be regarded as frequency content.
  • the electronic device obtains the high-frequency image by subtracting the up-sampled low-frequency image from the original image.
  • Layers L1, L2, L3, L4 and so on are obtained in the same way, and the Laplacian pyramid of the image corresponding to each target phase difference is thus obtained.
  • the Laplacian pyramid of the image corresponding to the phase difference of each target is fused to obtain a fused Laplacian pyramid.
  • the electronic device obtains the weight of the image corresponding to each target phase difference, and performs fusion according to the weight of the image corresponding to each target phase difference and the Laplacian pyramid of the image corresponding to each target phase difference, to obtain the fused image Pyramid of Laplace.
  • The fusion formula is as follows (taking the sixth layer, counted from the bottom up, as an example):
  • L5(fused) = Weight1 × L5(image 1) + Weight2 × L5(image 2)
  • where L5(fused) is the sixth layer of the fused Laplacian pyramid counted from the bottom up, Weight1 is the weight of image 1, Weight2 is the weight of image 2, and L5(image 1) and L5(image 2) are the sixth layers of the Laplacian pyramids of image 1 and image 2, respectively, counted from the bottom up.
  • The weight of each image can be adjusted according to parameters such as depth of field and degree of blur. For example, regions with a low degree of blur are given a large weight, regions with a high degree of blur are given a small weight, regions with a small depth of field are given a large weight, and regions with a large depth of field are given a small weight.
  • reconstruction processing is performed according to the fused Laplacian pyramid to obtain a fully in-focus image.
  • During reconstruction, the electronic device merges from the top layer of the pyramid down to the bottom layer.
  • The fused G5 is obtained by fusing the G5 layer images of the Gaussian pyramids corresponding to each target phase difference, and L5 (fused) is the L5 layer of the fused Laplacian pyramid.
  • R5, the top reconstructed layer, is obtained from the fused G5 during reconstruction.
  • The reconstruction then continues downward with the fused L4, L3, and so on, layer by layer, until R0, the final synthesis result, that is, the fully in-focus image, is obtained.
  • In this way, the image corresponding to each target phase difference is convolved and sampled, and when the preset iteration condition is met, the Gaussian pyramid of the image corresponding to each target phase difference is obtained; each layer of that Gaussian pyramid is processed to obtain the Laplacian pyramid of the image corresponding to each target phase difference; the Laplacian pyramids of the images corresponding to each target phase difference are fused to obtain the fused Laplacian pyramid; and reconstruction is performed according to the fused Laplacian pyramid to obtain a fully in-focus image.
  • The image can thus be synthesized according to low-frequency contours and high-frequency details, which makes the boundaries between regions more natural and improves the authenticity and clarity of the image.
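  • The pyramid fusion described above can be sketched as follows (Python with OpenCV; the number of levels, the equal scalar weights, and the shape handling are illustrative assumptions, and per-region weights based on blur or depth of field could replace the scalar weights):

```python
import cv2
import numpy as np

def gaussian_pyramid(img, levels=5):
    pyr = [img.astype(np.float32)]
    for _ in range(levels):
        pyr.append(cv2.pyrDown(pyr[-1]))      # convolve with a Gaussian kernel and downsample
    return pyr                                # G0 ... G5

def laplacian_pyramid(gauss):
    lap = []
    for i in range(len(gauss) - 1):
        up = cv2.pyrUp(gauss[i + 1])[:gauss[i].shape[0], :gauss[i].shape[1]]
        lap.append(gauss[i] - up)             # high-frequency detail layer Li
    lap.append(gauss[-1])                     # keep the low-frequency top level
    return lap

def fuse_pyramids(pyramids, weights):
    # Weighted sum per level, e.g. L5(fused) = w1 * L5(image 1) + w2 * L5(image 2)
    return [sum(w * p[k] for w, p in zip(weights, pyramids))
            for k in range(len(pyramids[0]))]

def reconstruct(lap_fused):
    img = lap_fused[-1]                       # start from the fused top (low-frequency) level
    for lvl in reversed(lap_fused[:-1]):
        img = cv2.pyrUp(img)[:lvl.shape[0], :lvl.shape[1]] + lvl  # add fused detail layer by layer
    return np.clip(img, 0, 255).astype(np.uint8)                  # R0: the fully in-focus result

def fuse_all_in_focus(images, weights=None):
    weights = weights or [1.0 / len(images)] * len(images)
    laps = [laplacian_pyramid(gaussian_pyramid(im)) for im in images]
    return reconstruct(fuse_pyramids(laps, weights))
```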
  • FIG. 14 is a schematic flowchart of synthesizing a fully in-focus image in another embodiment. Synthesizing the images corresponding to each target phase difference to obtain a fully in-focus image includes:
  • FIG. 15 is a schematic flowchart of synthesizing to obtain a full in-focus image in another embodiment.
  • The electronic device uses a convolutional neural network to perform convolution processing and feature extraction on the image corresponding to each target phase difference. For example, in FIG. 15, convolution → feature extraction → convolution is performed on image 1, and convolution → feature extraction → convolution is performed on image 2.
  • the features of the image corresponding to each target phase difference are fused to obtain the first image feature.
  • the electronic device fuses the features of the image corresponding to each target phase difference, and calculates the activation function to obtain the first image feature.
  • the image corresponding to each target phase difference is averaged to obtain an average image.
  • the electronic device performs averaging processing on the brightness value of the image corresponding to each target phase difference to obtain an average image.
  • Operation 1408: perform feature extraction according to the average image and the first image feature to obtain the second image feature.
  • the electronic device performs feature extraction according to the average image and the first image feature to obtain the second image feature.
  • Operation 1410: perform feature reconstruction according to the second image feature and the average image to obtain a fully in-focus image.
  • the electronic device performs feature reconstruction according to the second image feature and the average image to obtain a fully in-focus image.
  • The image processing method in the embodiment of the present application extracts the features of the image corresponding to each target phase difference and fuses them to obtain the first image feature; performs averaging processing on the images corresponding to each target phase difference to obtain an average image; performs feature extraction based on the average image and the first image feature to obtain the second image feature; and performs feature reconstruction based on the second image feature and the average image to obtain a fully in-focus image. In this way, a neural network can be used to synthesize the images, improving the accuracy and clarity of image synthesis.
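  • The learned fusion can be sketched with a few convolutional layers (PyTorch; the layer counts, channel widths, and the way the average image is injected are illustrative assumptions, since the patent does not fix a specific network architecture):

```python
import torch
import torch.nn as nn

class FocusFusionNet(nn.Module):
    """Sketch: per-image feature extraction -> feature fusion -> feature extraction
    conditioned on the average image -> reconstruction of an all-in-focus image."""
    def __init__(self, channels=16):
        super().__init__()
        self.extract = nn.Sequential(                       # convolution -> feature extraction -> convolution
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU())
        self.refine = nn.Sequential(                        # second feature-extraction stage
            nn.Conv2d(channels + 1, channels, 3, padding=1), nn.ReLU())
        self.reconstruct = nn.Conv2d(channels + 1, 1, 3, padding=1)

    def forward(self, images):                              # images: (B, N, 1, H, W), one per target PD
        feats = [self.extract(images[:, i]) for i in range(images.shape[1])]
        first_feature = torch.stack(feats, dim=0).mean(dim=0)   # fused first image feature
        avg_image = images.mean(dim=1)                           # average image
        second_feature = self.refine(torch.cat([first_feature, avg_image], dim=1))
        return self.reconstruct(torch.cat([second_feature, avg_image], dim=1))
```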
  • acquiring the preview image includes: acquiring the region of interest in the preview image; and dividing the preview image into at least two subregions includes: dividing the region of interest in the preview image into at least two subregions.
  • In image processing, the region of interest is the area to be processed, outlined from the image to be processed in the form of a box, circle, ellipse, irregular polygon, or the like.
  • the area of interest can include background and objects.
  • the electronic device receives the trigger instruction on the first preview image, and obtains the region of interest selected by the user according to the trigger instruction.
  • the electronic device divides the region of interest into at least two sub-regions.
  • For example, the electronic device may divide the region of interest selected by the user into N × N sub-regions.
  • Alternatively, the electronic device may divide the region of interest selected by the user into N × M sub-regions, etc., which is not limited thereto. Both N and M are positive integers.
  • The image processing method in the embodiment of the present application obtains the region of interest in the preview image and divides the region of interest in the preview image into at least two sub-regions, so that focusing can be performed according to the region of interest, which ensures that the scene in the region of interest is clear and improves the image clarity of the region of interest in the fully in-focus image.
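  • Dividing the region of interest into sub-regions is a simple grid operation; a sketch (Python; returning box coordinates rather than image copies is an implementation choice, and the N × M grid is whatever the device selects):

```python
def divide_roi(x0, y0, x1, y1, n=3, m=3):
    """Divide the region of interest [x0:x1, y0:y1] into n x m sub-regions,
    returned as (left, top, right, bottom) boxes."""
    boxes = []
    for i in range(n):
        for j in range(m):
            left = x0 + (x1 - x0) * j // m
            right = x0 + (x1 - x0) * (j + 1) // m
            top = y0 + (y1 - y0) * i // n
            bottom = y0 + (y1 - y0) * (i + 1) // n
            boxes.append((left, top, right, bottom))
    return boxes
```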
  • determining the at least two target phase differences according to the phase difference corresponding to each sub-region includes: acquiring a scene mode; and determining the at least two target phase differences according to the scene mode.
  • each scene mode can correspond to different types of target phase differences.
  • the scene mode may be a night scene mode, a panoramic mode, etc., and is not limited thereto.
  • For example, the target phase differences corresponding to scene mode A are the foreground phase difference and the background phase difference.
  • The target phase differences corresponding to scene mode B are the foreground phase difference, the median phase difference, and the background phase difference, and so on, which is not limited thereto.
  • The image processing method in the embodiment of the present application obtains the scene mode and determines at least two target phase differences according to the scene mode, so that the target phase differences can be determined quickly for different scene modes, the effect corresponding to different scenes is achieved, and image processing efficiency and image clarity are improved.
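  • One way to realize this scene-mode lookup (Python; the mode names and the phase-difference statistics each mode selects are illustrative assumptions):

```python
SCENE_MODE_TARGETS = {
    # scene mode -> labels of the target phase differences to use
    "scene_a": ["foreground", "background"],
    "scene_b": ["foreground", "median", "background"],
}

def targets_for_scene(mode, stats):
    """stats maps a label such as 'foreground' to a phase difference value
    computed from the sub-regions (for example a set mean, minimum or maximum)."""
    labels = SCENE_MODE_TARGETS.get(mode, ["foreground", "background"])
    return [stats[label] for label in labels]
```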
  • Fig. 16 is a structural block diagram of an image processing apparatus according to an embodiment.
  • an image processing device includes a preview image acquisition module 1602, a division module 1604, a phase difference acquisition module 1606, a focus module 1608, and a synthesis module 1610, in which:
  • the preview image acquisition module 1602 is used to acquire a preview image.
  • the dividing module 1604 is configured to divide the preview image into at least two sub-areas.
  • the phase difference acquiring module 1606 is configured to acquire the phase difference corresponding to each of the at least two sub-regions.
  • the phase difference acquisition module 1606 is further configured to determine at least two target phase differences from the phase differences corresponding to each sub-region, and the at least two target phase differences include the target foreground phase difference and the target background phase difference.
  • the focusing module 1608 is used for focusing according to the phase difference of each target to obtain an image corresponding to the phase difference of each target.
  • the synthesizing module 1610 is used to synthesize the images corresponding to each target phase difference to obtain a fully in-focus image.
  • The image processing device in this embodiment obtains a preview image, divides the preview image into at least two sub-regions, obtains the phase difference corresponding to each of the at least two sub-regions, and determines at least two target phase differences from the phase differences corresponding to the sub-regions, the at least two target phase differences including the target foreground phase difference and the target background phase difference.
  • Focusing is performed according to each target phase difference to obtain an image corresponding to each target phase difference, so that at least two images with different focus positions can be acquired: one is a background in-focus image, and the other is a foreground in-focus image.
  • The images corresponding to each target phase difference are synthesized to obtain a fully in-focus image, so that an image with less out-of-focus area can be obtained and image clarity is improved.
  • The phase difference obtaining module 1606 is configured to divide the phase differences corresponding to the at least two sub-regions into a foreground phase difference set and a background phase difference set; obtain the first phase difference mean value corresponding to the foreground phase difference set; obtain the second phase difference mean value corresponding to the background phase difference set; use the first phase difference mean value as the target foreground phase difference; and use the second phase difference mean value as the target background phase difference.
  • The image processing device in this embodiment divides the phase differences corresponding to the at least two sub-regions into a foreground phase difference set and a background phase difference set, obtains the first phase difference mean value corresponding to the foreground phase difference set and the second phase difference mean value corresponding to the background phase difference set, uses the first phase difference mean value as the target foreground phase difference and the second phase difference mean value as the target background phase difference, so that the foreground in-focus image and the background in-focus image can be obtained according to the mean values, improving the clarity of the image.
  • The phase difference acquisition module 1606 is used to exclude the largest phase difference among the phase differences corresponding to the sub-regions to obtain a remaining phase difference set, and to divide the remaining phase difference set into a foreground phase difference set and a background phase difference set.
  • Excluding the largest phase difference to obtain the remaining phase difference set can eliminate the farthest background; the remaining phase difference set is then divided into a foreground phase difference set and a background phase difference set, and focusing based on the mean values can improve the clarity of the image.
  • the phase difference obtaining module 1606 is configured to obtain the maximum phase difference and the minimum phase difference among the phase differences of at least two sub-regions; the minimum phase difference is regarded as the foreground phase difference; and the maximum phase difference is regarded as the background phase difference.
  • The image processing device in this embodiment obtains the maximum phase difference and the minimum phase difference among the phase differences of the at least two sub-regions, uses the minimum phase difference as the foreground phase difference and the maximum phase difference as the background phase difference, so that only two images need to be synthesized, improving image processing efficiency while improving image clarity.
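  • The two ways of deriving target phase differences described by these modules can be sketched together (Python/NumPy; using the median as the split point between the foreground and background sets is an illustrative assumption, since the patent only requires dividing the phase differences into two sets):

```python
import numpy as np

def targets_from_means(phase_diffs, exclude_farthest=True):
    """Split the sub-region phase differences into foreground/background sets and
    use the set means as the target foreground/background phase differences."""
    pds = np.sort(np.asarray(phase_diffs, dtype=np.float32))
    if exclude_farthest:
        pds = pds[:-1]                      # exclude the largest PD (farthest background)
    split = np.median(pds)                  # assumed split point between the two sets
    foreground = pds[pds <= split]
    background = pds[pds > split]
    if background.size == 0:                # degenerate case: all phase differences equal
        background = pds[-1:]
    return float(foreground.mean()), float(background.mean())

def targets_from_extremes(phase_diffs):
    """Alternative: the minimum PD as the foreground target, the maximum as the background."""
    pds = np.asarray(phase_diffs, dtype=np.float32)
    return float(pds.min()), float(pds.max())
```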
  • the electronic device includes an image sensor, the image sensor includes a plurality of pixel point groups arranged in an array, each of the pixel point groups includes M*N pixel points arranged in an array; each pixel point corresponds to one In the photosensitive unit, both M and N are natural numbers greater than or equal to 2.
  • The phase difference obtaining module 1606 is used to obtain the target brightness map according to the brightness values of the pixels included in each pixel point group; perform segmentation processing on the target brightness map and obtain the first segmented brightness map and the second segmented brightness map according to the result of the segmentation processing; determine the phase difference of the matched pixels according to the position difference of the matched pixels in the first segmented brightness map and the second segmented brightness map; and determine the phase difference corresponding to each of the at least two sub-regions according to the phase difference of the matched pixels.
  • The image processing device in this embodiment obtains a target brightness map according to the brightness values of the pixel points included in each pixel point group in the image sensor, performs segmentation processing on the target brightness map, and obtains the first segmented brightness map and the second segmented brightness map according to the result of the segmentation processing. Then, based on the position difference of the matched pixels in the first segmented brightness map and the second segmented brightness map, the phase difference of the matched pixels is determined, and according to the phase difference of the matched pixels, the phase difference corresponding to each of the at least two sub-regions is determined. In this way, the brightness values of the pixel points included in each pixel point group in the image sensor can be used to determine the phase difference.
  • the phase difference of the matched pixels in the embodiment of the present application contains relatively rich phase difference information. The accuracy of the phase difference obtained can be improved.
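  • The position-difference computation for matched pixels can be sketched as a one-dimensional search between corresponding lines of the two segmented brightness maps (Python/NumPy; the search window and the sum-of-absolute-differences matching cost are illustrative assumptions):

```python
import numpy as np

def row_phase_difference(row_a, row_b, max_shift=8, window=5):
    """For each pixel of row_a, find the signed shift that best aligns a small
    window with row_b; that shift is the phase difference of the matched pixels."""
    half = window // 2
    a = np.pad(row_a.astype(np.float32), half, mode="edge")
    b = np.pad(row_b.astype(np.float32), half + max_shift, mode="edge")
    pd = np.zeros(row_a.size, dtype=np.float32)
    for x in range(row_a.size):
        patch = a[x:x + window]
        costs = [np.abs(patch - b[x + max_shift + s:x + max_shift + s + window]).sum()
                 for s in range(-max_shift, max_shift + 1)]
        pd[x] = np.argmin(costs) - max_shift   # signed position difference
    return pd
```

  • The per-pixel phase differences can then be aggregated, for example averaged, within each of the at least two sub-regions to obtain the phase difference corresponding to each sub-region.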
  • The phase difference acquisition module 1606 is used to perform segmentation processing on the target brightness map to obtain multiple brightness map regions, each brightness map region including a row of pixels in the target brightness map, or each brightness map region including a column of pixels in the target brightness map; obtain multiple first brightness map regions and multiple second brightness map regions from the multiple brightness map regions, the first brightness map region including pixels in even-numbered rows of the target brightness map, or the first brightness map region including pixels in even-numbered columns of the target brightness map, and the second brightness map region including pixels in odd-numbered rows of the target brightness map, or the second brightness map region including pixels in odd-numbered columns of the target brightness map; use the multiple first brightness map regions to compose a first segmented brightness map; and use the multiple second brightness map regions to compose a second segmented brightness map.
  • the image processing device in the embodiment of the present application does not need to shield the pixels to obtain the phase difference, and obtains relatively rich phase difference information by means of brightness segmentation, which improves the accuracy of the obtained phase difference.
  • The phase difference acquisition module 1606 is configured to use the vertical phase difference as the phase difference corresponding to the sub-region when it is detected that the sub-region contains horizontal lines, and to use the horizontal phase difference as the phase difference corresponding to the sub-region when it is detected that the sub-region does not contain horizontal lines.
  • If the phase difference direction is not selected appropriately, lines in the sub-region may have problems such as smearing. When it is detected that the sub-region contains horizontal lines, the vertical phase difference is used as the phase difference corresponding to the sub-region; when it is detected that the sub-region contains vertical lines, the horizontal phase difference is used as the phase difference corresponding to the sub-region. This can improve the accuracy of the obtained phase difference and thereby improve image clarity.
  • the focusing module 1608 is configured to use the sub-area corresponding to each target phase difference as a focus area to obtain an image corresponding to each target phase difference.
  • the image processing device in this embodiment uses the sub-area corresponding to each target phase difference as a focus area to obtain an image corresponding to each target phase difference, and can obtain images with different focal points for synthesis, thereby improving image clarity.
  • The synthesis module 1610 is configured to divide the image corresponding to each target phase difference into the same number of sub-image areas; obtain the definition corresponding to each sub-image area; determine, according to the definition corresponding to each sub-image area, the sub-image area with the highest definition among the sub-image areas that match each other; and synthesize the sub-image areas with the highest definition to obtain a fully in-focus image.
  • In this way, a fully in-focus image can be obtained quickly and image processing efficiency is improved.
  • the synthesis module 1610 is used for convolution and sampling processing of the image corresponding to each target phase difference, and when the preset iterative conditions are met, the Gaussian pyramid of the image corresponding to each target phase difference is obtained; Each layer of the Gaussian pyramid of the image corresponding to the target phase difference is processed to obtain the Laplacian pyramid of the image corresponding to each target phase difference; the Laplacian pyramid of the image corresponding to each target phase difference is fused , The fused Laplacian pyramid is obtained; reconstruction processing is performed according to the fused Laplacian pyramid to obtain a fully in-focus image.
  • The image processing device in this embodiment performs convolution and sampling processing on the image corresponding to each target phase difference, and when the preset iteration condition is met, obtains the Gaussian pyramid of the image corresponding to each target phase difference; processes each layer of the Gaussian pyramid to obtain the Laplacian pyramid of the image corresponding to each target phase difference; fuses the Laplacian pyramids of the images corresponding to each target phase difference to obtain a fused Laplacian pyramid; and performs reconstruction according to the fused Laplacian pyramid to obtain a fully in-focus image, which can make the boundaries between regions more natural and improve the authenticity and clarity of the image.
  • The synthesis module 1610 is used to extract the features of the image corresponding to each target phase difference; fuse the features of the images corresponding to each target phase difference to obtain the first image feature; average the images corresponding to each target phase difference to obtain an average image; perform feature extraction based on the average image and the first image feature to obtain the second image feature; and perform feature reconstruction based on the second image feature and the average image to obtain a fully in-focus image.
  • The image processing device in the embodiment of the present application extracts the features of the image corresponding to each target phase difference and fuses them to obtain the first image feature; performs averaging processing on the images corresponding to each target phase difference to obtain an average image; performs feature extraction based on the average image and the first image feature to obtain the second image feature; and performs feature reconstruction based on the second image feature and the average image to obtain a fully in-focus image, so that a neural network can be used to synthesize the images, improving the accuracy and clarity of image synthesis.
  • the preview image acquisition module 1602 is used to acquire the region of interest in the preview image.
  • the dividing module 1604 is configured to divide the region of interest in the preview image into at least two sub-regions.
  • The image processing device in the embodiment of the present application obtains the region of interest in the preview image and divides the region of interest in the preview image into at least two sub-regions, so that focusing can be performed according to the region of interest, which ensures that the scene in the region of interest is clear and improves the image clarity of the region of interest in the fully in-focus image.
  • the phase difference obtaining module 1606 is used to obtain a scene mode; and determine at least two target phase differences according to the scene mode.
  • The image processing device in the embodiment of the present application obtains the scene mode and determines at least two target phase differences according to the scene mode, so that the target phase differences can be determined quickly for different scene modes, the effect corresponding to different scenes is achieved, and image processing efficiency and image clarity are improved.
  • the division of the modules in the above-mentioned image processing apparatus is only for illustration. In other embodiments, the image processing apparatus may be divided into different modules as required to complete all or part of the functions of the above-mentioned image processing apparatus.
  • Each module in the above-mentioned image processing device may be implemented in whole or in part by software, hardware, and a combination thereof.
  • The above-mentioned modules may be embedded in hardware form in, or be independent of, the processor in the computer device, or may be stored in software form in the memory of the computer device, so that the processor can call and execute the operations corresponding to the above-mentioned modules.
  • Fig. 17 is a schematic diagram of the internal structure of an electronic device in an embodiment.
  • the electronic device includes a processor and a memory connected through a system bus.
  • the processor is used to provide computing and control capabilities to support the operation of the entire electronic device.
  • the memory may include a non-volatile storage medium and internal memory.
  • the non-volatile storage medium stores an operating system and a computer program.
  • the computer program can be executed by the processor to implement an image processing method provided in the following embodiments.
  • The internal memory provides a cached operating environment for the operating system and the computer program in the non-volatile storage medium.
  • The electronic device can be a mobile phone, a tablet computer, a personal digital assistant, or a wearable device.
  • each module in the image processing apparatus provided in the embodiment of the present application may be in the form of a computer program.
  • the computer program can be run on a terminal or a server.
  • the program module composed of the computer program can be stored in the memory of the terminal or the server.
  • the embodiment of the present application also provides a computer-readable storage medium.
  • A computer program product containing instructions that, when run on a computer, cause the computer to execute an image processing method.
  • Non-volatile memory may include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory may include random access memory (RAM), which acts as external cache memory.
  • RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous Link (Synchlink) DRAM (SLDRAM), memory bus (Rambus) direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).

Abstract

The invention relates to an image processing method, comprising: acquiring a preview image; dividing the preview image into at least two sub-areas; acquiring a phase difference corresponding to each of the at least two sub-areas; determining at least two target phase differences from the phase differences corresponding to each sub-area, the at least two target phase differences comprising a target foreground phase difference and a target background phase difference; focusing according to each target phase difference to obtain an image corresponding to each target phase difference; and combining the images corresponding to each target phase difference to obtain a fully in-focus image.
PCT/CN2020/126122 2019-11-12 2020-11-03 Procédé et appareil de traitement d'image, dispositif électronique et support de stockage lisible par ordinateur WO2021093635A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911101432.0 2019-11-12
CN201911101432.0A CN112866549B (zh) 2019-11-12 2019-11-12 图像处理方法和装置、电子设备、计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2021093635A1 true WO2021093635A1 (fr) 2021-05-20

Family

ID=75912513

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/126122 WO2021093635A1 (fr) 2019-11-12 2020-11-03 Procédé et appareil de traitement d'image, dispositif électronique et support de stockage lisible par ordinateur

Country Status (2)

Country Link
CN (1) CN112866549B (fr)
WO (1) WO2021093635A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113468702A (zh) * 2021-07-22 2021-10-01 久瓴(江苏)数字智能科技有限公司 管线排布方法、装置以及计算机可读存储介质
CN113962859A (zh) * 2021-10-26 2022-01-21 北京有竹居网络技术有限公司 一种全景图生成方法、装置、设备及介质
CN115022535A (zh) * 2022-05-20 2022-09-06 深圳福鸽科技有限公司 图像处理方法、装置及电子设备
CN115314635A (zh) * 2022-08-03 2022-11-08 Oppo广东移动通信有限公司 用于离焦量确定的模型训练方法及装置

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113259596B (zh) * 2021-07-14 2021-10-08 北京小米移动软件有限公司 图像生成方法、相位检测对焦方法及装置
CN114040081A (zh) * 2021-11-30 2022-02-11 维沃移动通信有限公司 图像传感器、摄像模组、电子设备、对焦方法及介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012215700A (ja) * 2011-03-31 2012-11-08 Fujifilm Corp 撮像装置及び撮像プログラム
CN105100615A (zh) * 2015-07-24 2015-11-25 青岛海信移动通信技术股份有限公司 一种图像的预览方法、装置及终端
CN106031154A (zh) * 2014-02-19 2016-10-12 三星电子株式会社 处理图像的方法和用于其的电子设备
CN106454289A (zh) * 2016-11-29 2017-02-22 广东欧珀移动通信有限公司 控制方法、控制装置及电子装置
CN110166680A (zh) * 2019-06-28 2019-08-23 Oppo广东移动通信有限公司 设备成像方法、装置、存储介质及电子设备

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2277855A1 (fr) * 1999-07-14 2001-01-14 Solvision Methode et systeme de mesure de la hauteur des billes de soudure d'un circuit imprime
CN105120154A (zh) * 2015-08-20 2015-12-02 深圳市金立通信设备有限公司 一种图像处理方法及终端
CN106060407A (zh) * 2016-07-29 2016-10-26 努比亚技术有限公司 一种对焦方法及终端
CN106572305A (zh) * 2016-11-03 2017-04-19 乐视控股(北京)有限公司 一种图像拍摄与处理方法、装置及电子设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012215700A (ja) * 2011-03-31 2012-11-08 Fujifilm Corp 撮像装置及び撮像プログラム
CN106031154A (zh) * 2014-02-19 2016-10-12 三星电子株式会社 处理图像的方法和用于其的电子设备
CN105100615A (zh) * 2015-07-24 2015-11-25 青岛海信移动通信技术股份有限公司 一种图像的预览方法、装置及终端
CN106454289A (zh) * 2016-11-29 2017-02-22 广东欧珀移动通信有限公司 控制方法、控制装置及电子装置
CN110166680A (zh) * 2019-06-28 2019-08-23 Oppo广东移动通信有限公司 设备成像方法、装置、存储介质及电子设备

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113468702A (zh) * 2021-07-22 2021-10-01 久瓴(江苏)数字智能科技有限公司 管线排布方法、装置以及计算机可读存储介质
CN113468702B (zh) * 2021-07-22 2024-03-22 久瓴(江苏)数字智能科技有限公司 管线排布方法、装置以及计算机可读存储介质
CN113962859A (zh) * 2021-10-26 2022-01-21 北京有竹居网络技术有限公司 一种全景图生成方法、装置、设备及介质
CN115022535A (zh) * 2022-05-20 2022-09-06 深圳福鸽科技有限公司 图像处理方法、装置及电子设备
CN115022535B (zh) * 2022-05-20 2024-03-08 深圳福鸽科技有限公司 图像处理方法、装置及电子设备
CN115314635A (zh) * 2022-08-03 2022-11-08 Oppo广东移动通信有限公司 用于离焦量确定的模型训练方法及装置
CN115314635B (zh) * 2022-08-03 2024-03-26 Oppo广东移动通信有限公司 用于离焦量确定的模型训练方法及装置

Also Published As

Publication number Publication date
CN112866549A (zh) 2021-05-28
CN112866549B (zh) 2022-04-12

Similar Documents

Publication Publication Date Title
WO2021093635A1 (fr) Procédé et appareil de traitement d'image, dispositif électronique et support de stockage lisible par ordinateur
KR102278776B1 (ko) 이미지 처리 방법, 기기, 및 장치
CN110428366B (zh) 图像处理方法和装置、电子设备、计算机可读存储介质
US8749694B2 (en) Methods and apparatus for rendering focused plenoptic camera data using super-resolved demosaicing
CN110536057B (zh) 图像处理方法和装置、电子设备、计算机可读存储介质
US8724000B2 (en) Methods and apparatus for super-resolution in integral photography
JP6168794B2 (ja) 情報処理方法および装置、プログラム。
US8340512B2 (en) Auto focus technique in an image capture device
EP2504992A2 (fr) Appareil de traitement d'images et procédé
WO2021082883A1 (fr) Procédé et appareil de détection de corps principal, dispositif électronique et support de stockage lisible par ordinateur
JP2019533957A (ja) 端末のための撮影方法及び端末
US11282176B2 (en) Image refocusing
US10469728B2 (en) Imaging device having a lens array of micro lenses
JP2015088833A (ja) 画像処理装置、撮像装置及び画像処理方法
CN112087571A (zh) 图像采集方法和装置、电子设备、计算机可读存储介质
JP6544978B2 (ja) 画像出力装置およびその制御方法、撮像装置、プログラム
CN112866675B (zh) 深度图生成方法和装置、电子设备、计算机可读存储介质
Georgiev et al. Rich image capture with plenoptic cameras
CN112019734B (zh) 图像采集方法、装置、电子设备和计算机可读存储介质
JP6976754B2 (ja) 画像処理装置および画像処理方法、撮像装置、プログラム
CN112866655B (zh) 图像处理方法和装置、电子设备、计算机可读存储介质
CN112866547B (zh) 对焦方法和装置、电子设备、计算机可读存储介质
WO2021093528A1 (fr) Procédé et appareil de mise au point, dispositif électronique et support de stockage lisible par ordinateur
CN112866554B (zh) 对焦方法和装置、电子设备、计算机可读存储介质
WO2021093502A1 (fr) Procédé et appareil d'obtention de différence de phase, et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20888614

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20888614

Country of ref document: EP

Kind code of ref document: A1