CN109685853B - Image processing method, image processing device, electronic equipment and computer readable storage medium

Info

Publication number: CN109685853B (application CN201811454206.6A)
Authority: CN (China)
Prior art keywords: image, target, camera, calibration information, camera module
Legal status: Active
Application number: CN201811454206.6A
Other languages: Chinese (zh)
Other versions: CN109685853A (en)
Inventor: 陈岩 (Chen Yan)
Current assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201811454206.6A
Publication of application CN109685853A
Application granted; publication of granted patent CN109685853B

Classifications

    • G (PHYSICS) > G06 (COMPUTING; CALCULATING OR COUNTING) > G06T (IMAGE DATA PROCESSING OR GENERATION, IN GENERAL)
    • G06T 7/00 Image analysis > G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 5/00 Image enhancement or restoration > G06T 5/40 Image enhancement or restoration using histogram techniques
    • G06T 5/00 Image enhancement or restoration > G06T 5/80 Geometric correction
    • G06T 7/00 Image analysis > G06T 7/90 Determination of colour characteristics
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement > G06T 2207/10 Image acquisition modality > G06T 2207/10024 Color image

Landscapes

  • Engineering & Computer Science
  • Physics & Mathematics
  • General Physics & Mathematics
  • Theoretical Computer Science
  • Computer Vision & Pattern Recognition
  • Studio Devices
  • Image Processing

Abstract

The application relates to an image processing method, an image processing apparatus, an electronic device, and a computer-readable storage medium. The method comprises the following steps: when an initial image is collected through the camera module, obtaining a collection aperture value corresponding to the camera module; obtaining, from calibration information, first target calibration information corresponding to the collection aperture value, wherein the calibration information is obtained by calibrating the camera module at different aperture values; and correcting the initial image according to the first target calibration information to obtain a target image. Because the calibration information corresponding to the collection aperture value is used to process the initial image, the accuracy of image processing can be improved.

Description

Image processing method, image processing device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of image technologies, and in particular, to an image processing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
With the development of imaging technology, capturing images with electronic devices has become increasingly common. An electronic device collects images through a camera module, allowing objects in three-dimensional space to be imaged. During shooting, images can be captured with different apertures, shutter speeds, sensitivities, and so on; however, different shooting conditions affect the effect of the captured image, and there is a problem of low image-processing accuracy.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, electronic equipment and a computer readable storage medium, which can improve the accuracy of image processing.
An image processing method comprising:
acquiring a collection aperture value corresponding to a camera module when an initial image is collected through the camera module;
acquiring first target calibration information corresponding to the acquired aperture value from calibration information, wherein the calibration information is obtained by calibrating the camera module under different aperture values;
and correcting the initial image according to the first target calibration information to obtain a target image.
An image processing apparatus comprising:
the camera module is used for acquiring an initial image and acquiring a collection aperture value corresponding to the camera module;
the calibration information acquisition module is used for acquiring first target calibration information corresponding to the acquired aperture value from calibration information, wherein the calibration information is obtained by calibrating the camera module under different aperture values;
and the processing module is used for correcting the initial image according to the first target calibration information to obtain a target image.
An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of:
acquiring a collection aperture value corresponding to a camera module when an initial image is collected through the camera module;
acquiring first target calibration information corresponding to the acquired aperture value from calibration information, wherein the calibration information is obtained by calibrating the camera module under different aperture values;
and correcting the initial image according to the first target calibration information to obtain a target image.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring a collection aperture value corresponding to a camera module when an initial image is collected through the camera module;
acquiring first target calibration information corresponding to the acquired aperture value from calibration information, wherein the calibration information is obtained by calibrating the camera module under different aperture values;
and correcting the initial image according to the first target calibration information to obtain a target image.
According to the image processing method, the image processing device, the electronic equipment and the computer readable storage medium, when the camera module collects the initial image, the collection aperture value corresponding to the camera module is obtained, the first target calibration information corresponding to the collection aperture value is obtained from the calibration information, and the initial image is corrected according to the first target calibration information to obtain the target image. The initial image can be processed by acquiring the corresponding calibration information according to the acquired aperture value, so that the accuracy of image processing can be improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a diagram of an application environment of an image processing method in one embodiment;
FIG. 2 is a flow diagram of a method of image processing in one embodiment;
FIG. 3 is a flowchart illustrating the calibration of a camera module according to an embodiment;
FIG. 4 is a schematic diagram illustrating calibration of a camera module according to an embodiment;
FIG. 5 is a flowchart of initial image correction processing in one embodiment;
FIG. 6 is a flow diagram of a method of image processing in one embodiment;
FIG. 7 is a flowchart of an image processing method in another embodiment;
FIG. 8 is a block diagram showing the configuration of an image processing apparatus according to an embodiment;
FIG. 9 is a block diagram showing an internal configuration of an electronic apparatus according to an embodiment;
FIG. 10 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first camera may be referred to as a second camera, and similarly, a second camera may be referred to as a first camera, without departing from the scope of the present application. The first camera and the second camera are both cameras, but they are not the same camera.
Fig. 1 is a schematic diagram of an application environment of an image processing method in an embodiment. As shown in fig. 1, the application environment includes an electronic device 110. The electronic device 110 has a camera module mounted thereon. Specifically, when the electronic device 110 collects an initial image through the camera module, the collected aperture value corresponding to the camera module is obtained, first target calibration information corresponding to the collected aperture value is obtained from the calibration information, and the initial image is corrected according to the first target calibration information to obtain a target image, where the calibration information is obtained by calibrating the camera module at different aperture values. It is understood that the electronic device 110 may be a mobile phone, a computer, a wearable device, etc., and is not limited thereto.
FIG. 2 is a flow diagram of a method of image processing in one embodiment. The image processing method in the present embodiment is described by taking the electronic device 110 in fig. 1 as an example. As shown in fig. 2, the image processing method includes steps 202 to 206. Wherein:
step 202, when the initial image is collected through the camera module, a collection aperture value corresponding to the camera module is obtained.
The camera module is an assembly including at least one camera. The camera module can be arranged inside the electronic device or outside it, so that the electronic device can acquire images through the camera module. Specifically, the camera module comprises at least one multi-aperture camera that can shoot at different adjustable aperture values, and the multi-aperture camera may be a color camera. The camera module may also comprise, in addition to the multi-aperture camera, one or more of another color camera, a black-and-white camera, a telephoto camera, a wide-angle camera, or a depth camera. For example, the camera module may include only a multi-aperture camera, or a color camera and a depth camera, or a color camera, a black-and-white camera, and a depth camera, without being limited thereto. The aperture value is the ratio of the focal length to the light-transmission diameter of the camera. The aperture of the camera is the mechanism that controls the amount of light reaching the light-sensing surface, and its size can be expressed by the aperture value. The larger the aperture value, the smaller the light-transmission diameter (i.e. the aperture) and the smaller the amount of light admitted per unit time; conversely, the smaller the aperture value, the larger the light-transmission diameter and the larger the amount of light admitted per unit time.
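The f-number relationship described above can be sketched in a few lines of code (an illustrative aside, not part of the patent; the focal length and f-numbers below are arbitrary example values):

```python
def f_number(focal_length_mm: float, pupil_diameter_mm: float) -> float:
    """Aperture value (f-number): focal length divided by light-transmission diameter."""
    return focal_length_mm / pupil_diameter_mm

def pupil_diameter(focal_length_mm: float, f_number_value: float) -> float:
    """Inverse relation: a larger f-number means a smaller light-transmission diameter."""
    return focal_length_mm / f_number_value

# For a 26 mm lens, f/1.8 admits light through a wider opening than f/4.0,
# so more light enters per unit time at f/1.8.
d_wide = pupil_diameter(26.0, 1.8)    # about 14.4 mm
d_narrow = pupil_diameter(26.0, 4.0)  # 6.5 mm
```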
The initial image refers to an image which is acquired by the camera module and is not subjected to correction processing. In particular, the initial image may be an image captured by one or more cameras in the camera module. The collected aperture value is the aperture value corresponding to the multi-aperture camera when the initial image is collected. When the electronic equipment collects an initial image through the camera module, acquiring a collection aperture value of the multi-aperture camera; when the initial image is an image acquired by the multi-aperture camera, the electronic equipment can also directly acquire an acquisition aperture value corresponding to the initial image.
Step 204, obtaining first target calibration information corresponding to the acquired aperture value from the calibration information, wherein the calibration information is obtained by calibrating the camera module under different aperture values.
The calibration information refers to the camera parameters obtained after the camera module is calibrated. The calibration information can be used to correct the images acquired by the camera module, so that the corrected images can restore the object in three-dimensional space. Specifically, when the camera module includes only one camera, the calibration information includes the monocular calibration information of that camera; when the camera module includes two or more cameras, the calibration information includes the monocular calibration information of each camera and the binocular calibration information between the multi-aperture camera and the other cameras. Calibration processing refers to the operation of solving the parameters of the geometric model of camera imaging, through which a captured image can restore an object in three-dimensional space. The electronic device can calibrate the camera module at different aperture values to obtain calibration information corresponding to each aperture value, and can then acquire the first target calibration information corresponding to the collection aperture value from the calibration information of each aperture value. In one embodiment, when there is no calibration information corresponding to the collection aperture value, the electronic device may acquire the calibration information corresponding to the aperture value closest to the collection aperture value as the first target calibration information.
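The nearest-aperture fallback described in this step can be sketched as follows (a minimal illustration; the table contents and the function name are assumptions, not from the patent):

```python
def select_calibration(calibration_table: dict, collection_aperture: float):
    """Return the calibration information for the collection aperture value.
    If no exact entry exists, fall back to the entry whose aperture value is
    closest to the collection aperture value, as described above."""
    if collection_aperture in calibration_table:
        return calibration_table[collection_aperture]
    nearest = min(calibration_table, key=lambda a: abs(a - collection_aperture))
    return calibration_table[nearest]

# Hypothetical per-aperture table; the stored values are placeholders.
table = {1.8: "calib@f/1.8", 2.8: "calib@f/2.8", 4.0: "calib@f/4.0"}
chosen = select_calibration(table, 3.0)  # no f/3.0 entry, so f/2.8 is used
```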
And step 206, correcting the initial image according to the first target calibration information to obtain a target image.
The electronic device corrects the initial image according to the first target calibration information. Specifically, when the camera module includes only one camera, the electronic device can directly perform monocular correction on the initial image acquired by that camera according to the monocular calibration information contained in the first target calibration information to obtain the target image; when the camera module includes two or more cameras, the electronic device can correct the initial images acquired by the cameras using the monocular calibration information contained in the first target calibration information and the binocular calibration information between the multi-aperture camera and the other cameras, obtaining the target image corresponding to each initial image. For example, when the electronic device acquires an initial image A1 through the multi-aperture camera in the camera module, it may correct the initial image A1 according to the monocular calibration information of the multi-aperture camera in the first target calibration information to obtain a processed target image A2; when the electronic device collects initial images B1 and C1 through the multi-aperture camera and the depth camera in the camera module respectively, it can correct the initial images B1 and C1 according to the monocular calibration information corresponding to each of the two cameras in the first target calibration information and the binocular calibration information between the multi-aperture camera and the depth camera, obtaining a processed target image B2 and a processed target image C2.
According to the image processing method provided by the embodiment of the application, when the camera module collects the initial image, the collection aperture value corresponding to the camera module is obtained, the first target calibration information corresponding to the collection aperture value is obtained from the calibration information, and the initial image is corrected according to the first target calibration information to obtain the target image. Because the calibration information corresponding to the collection aperture value is used to process the initial image, problems such as an inaccurate image center and an unclear image, which arise when images captured at different apertures are processed with the same calibration information, are avoided, and the accuracy of image processing can be improved.
As shown in fig. 3, in an embodiment, the provided image processing method further includes a process of performing calibration processing on the camera module, which specifically includes:
step 302, sequentially acquiring different aperture values of the camera module, and shooting the calibration plate with the different aperture values through the camera module to obtain a set of calibration images corresponding to each aperture value, wherein the set of calibration images includes the calibration images shot by each camera in the camera module.
The electronic device sequentially acquires the different aperture values of the camera module. Specifically, the electronic device can obtain each aperture value supported by the multi-aperture camera in the camera module, photograph the calibration plate through the camera module at each of these aperture values, and obtain a set of calibration images corresponding to each aperture value, the set including the calibration images captured by each camera in the camera module. A calibration plate is a board bearing a pattern of features at a fixed, known pitch. Specifically, the calibration plate may be a three-dimensional calibration plate with at least three calibration faces, or a flat plate with only one calibration face. Calibrating a camera requires at least three calibration views. Therefore, when the calibration plate is a three-dimensional calibration plate, the electronic device photographs it through the camera module at each aperture value, and each resulting set of calibration images can contain one image collected by each camera in the camera module, each image containing three different calibration patterns; when the calibration plate is a flat plate with only one calibration face, the electronic device photographs it from at least three angles through the camera module at each aperture value, so that each set of calibration images contains at least three images collected by each camera.
And 304, calibrating the camera module according to a group of calibration images corresponding to each aperture value to obtain calibration information corresponding to each aperture value of the camera module.
The calibration information refers to camera parameters obtained after the camera is calibrated. The calibration information can be used for correcting the image acquired by the camera, so that the corrected image can restore the object in the three-dimensional space. Specifically, the calibration information may include monocular calibration information of a single camera or binocular calibration information between two cameras, where the monocular calibration information includes internal parameters, external parameters, distortion coefficients, and the like of the cameras, and the binocular calibration information includes external parameters between the two cameras. The calibration processing refers to the operation of solving parameters in a geometric model imaged by the camera, and the shot image can restore an object in a three-dimensional space through the geometric model imaged by the camera. Specifically, the calibration process may include a monocular calibration process for finding internal parameters, external parameters, distortion coefficients, and the like of a single camera, and a binocular calibration process for finding external parameters between two cameras.
The electronic device calibrates the camera module according to the set of calibration images corresponding to each aperture value. Specifically, the electronic device may calibrate the camera module using a traditional camera calibration method, a camera self-calibration method, or Zhang Zhengyou's calibration method (which lies between the traditional calibration method and the self-calibration method), and so on, to obtain the calibration information corresponding to the camera module. For example, when the set of calibration images corresponding to a first aperture value includes a calibration image E acquired by the multi-aperture camera, a calibration image F acquired by the black-and-white camera, and a calibration image G acquired by the depth camera, the electronic device performs calibration according to the calibration images E, F, and G; the resulting calibration information for the camera module includes the internal parameters, external parameters, distortion coefficients, and so on corresponding to the multi-aperture camera, the black-and-white camera, and the depth camera, as well as the external parameters between the multi-aperture camera and the black-and-white camera and between the multi-aperture camera and the depth camera.
By sequentially acquiring the different aperture values of the camera module, photographing the calibration plate through the camera module at each aperture value to obtain a set of calibration images corresponding to each aperture value, and calibrating the camera module according to each set, the calibration information corresponding to the camera module at each aperture value is obtained. This avoids performing only a single calibration for a camera that supports multiple aperture values, and improves the accuracy of camera calibration. Furthermore, during use of the camera module, the calibration information corresponding to the aperture value in use can be retrieved to process images, which can improve the accuracy of image processing.
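One way to organize the per-aperture calibration information produced by steps 302 and 304 is a table keyed by aperture value, holding monocular entries per camera and binocular entries per camera pair. The class and field names below are assumptions for illustration; real parameter values would come from the calibration procedure:

```python
from dataclasses import dataclass, field

@dataclass
class MonocularCalibration:
    """Per-camera parameters named in the text (placeholder contents)."""
    intrinsics: list   # e.g. the 3x3 camera matrix, flattened
    extrinsics: list   # rotation/translation relative to a reference
    distortion: list   # distortion coefficients

@dataclass
class CalibrationInfo:
    """Calibration result for one aperture value of the camera module."""
    monocular: dict = field(default_factory=dict)  # camera name -> MonocularCalibration
    binocular: dict = field(default_factory=dict)  # (camera_a, camera_b) -> external params

# One CalibrationInfo per aperture value, as the procedure above produces.
calibration_table = {}
for aperture in (1.8, 2.8, 4.0):  # illustrative f-numbers
    info = CalibrationInfo()
    info.monocular["multi_aperture"] = MonocularCalibration([], [], [])
    info.monocular["depth"] = MonocularCalibration([], [], [])
    info.binocular[("multi_aperture", "depth")] = []  # R and t between the pair
    calibration_table[aperture] = info
```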
Fig. 4 is a schematic diagram illustrating an electronic device performing calibration processing on a camera module in an embodiment. Fig. 4 includes an electronic device 110 and a calibration board 120. The calibration plate (chart) 120 may be a three-dimensional calibration plate with at least three calibration faces, each with a chart pattern thereon. The electronic device 110 may obtain different aperture values of the camera module, capture the calibration plate 120 with different aperture values through the camera module, obtain a set of calibration images collected by the camera module under each aperture value, and perform calibration processing on the camera module according to a set of calibration images corresponding to each aperture value, so as to obtain calibration information of the camera module.
As shown in fig. 5, in one embodiment, the process of correcting the initial image according to the first target calibration information in the provided image processing method includes steps 502 to 504. Wherein:
step 502, performing monocular correction processing on initial images acquired by each camera in the camera module according to monocular calibration information contained in the first target calibration information to obtain an intermediate image.
The monocular calibration information is the calibration information corresponding to a single camera; specifically, it includes the internal parameters, external parameters, distortion coefficients, and so on of that camera. Monocular correction processing is the operation of processing an image according to the monocular calibration information so that the processed image can accurately restore the object in three-dimensional space. The camera module may be a module including a plurality of cameras. When the camera module contains only one camera, the electronic device can directly perform monocular correction on the initial image collected by that camera according to the monocular calibration information contained in the first target calibration information to obtain the target image. When the camera module contains two or more cameras, the electronic device can perform monocular correction on the initial images collected by the cameras according to the monocular calibration information contained in the first target calibration information to obtain intermediate images. For example, continuing the earlier example, when the electronic device acquires initial images B1 and C1 through the multi-aperture camera and the depth camera in the camera module respectively, it may perform monocular correction on the initial images B1 and C1 according to the monocular calibration information corresponding to each camera, obtaining intermediate images B11 and C21.
And step 504, performing binocular correction processing on the intermediate image according to binocular calibration information contained in the first target calibration information to obtain target images respectively corresponding to the cameras.
The binocular calibration information is the external parameters between any two cameras and characterizes their relative pose. Specifically, the binocular calibration information may include the rotation matrix and translation matrix between the two cameras. In one embodiment, the binocular calibration information corresponding to the camera module comprises the external parameters between each camera and the multi-aperture camera. Binocular correction processing is an operation that aligns two images so that their epipolar lines lie on the same horizontal line. The electronic device can perform binocular correction on the intermediate images according to the binocular calibration information contained in the first target calibration information, obtaining the target image corresponding to the initial image collected by each camera. For example, in the above example, after obtaining the intermediate images B11 and C21, the electronic device performs binocular correction on them based on the binocular calibration information between the multi-aperture camera and the depth camera, obtaining the target image B2 and the target image C2.
The initial images collected by the cameras in the camera module are corrected according to the monocular calibration information contained in the first target calibration information to obtain the intermediate images, and the intermediate images are corrected according to the binocular calibration information contained in the first target calibration information to obtain the target images respectively corresponding to the cameras, so that the accuracy of image processing can be improved.
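The goal of the binocular correction in step 504 (corresponding epipolar lines on the same horizontal line) can be illustrated with a deliberately simplified toy: estimate the vertical misalignment between matched keypoints of two views and remove it. Real binocular correction applies the rotation and translation from the binocular calibration information; the sketch below, with made-up keypoint coordinates, only demonstrates the row-alignment property:

```python
def estimate_vertical_offset(points_left, points_right):
    """Average vertical misalignment between matched keypoints (x, y).
    After ideal binocular correction this offset is zero, because
    corresponding epipolar lines lie on the same image row."""
    diffs = [yl - yr for (_, yl), (_, yr) in zip(points_left, points_right)]
    return sum(diffs) / len(diffs)

def shift_points_vertically(points, dy):
    """Toy rectification step: translate one view so matched rows align."""
    return [(x, y + dy) for (x, y) in points]

# Matched keypoints; the right view sits 3 pixels lower than the left view.
left = [(10, 52), (40, 80), (70, 33)]
right = [(12, 49), (43, 77), (69, 30)]
dy = estimate_vertical_offset(left, right)          # 3.0
right_aligned = shift_points_vertically(right, dy)  # rows now align
```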
In an embodiment, in the provided image processing method, the camera module includes a first camera and a second camera, and the corrected target image includes a first target image acquired by the first camera and a second target image acquired by the second camera, and the image processing method may further include: obtaining a first depth image according to the first target image and the second target image; the first target image is processed according to the first depth image.
The first target image and the second target image share the same shooting scene. The first depth image contains depth information, which is the distance between the camera and each subject in the shooting scene, and describes the distribution of the subjects in the first target image. The electronic device obtains the first depth image from the first target image and the second target image. Specifically, the electronic device may calculate the parallax between the first camera and the second camera from the first target image and the second target image, convert that parallax into the depth of the photographed object, and form the first depth image from this depth information. In everyday shooting, the camera is usually close to the main subject. The electronic device processes the first target image according to the first depth image: it may apply color enhancement, AR (Augmented Reality) processing, and so on to areas of the first target image with smaller depth values; it may classify areas with larger depth values as the background area and apply blurring, background replacement, and so on to the background area, thereby highlighting the subject in the image.
Because the first depth image is obtained from the first target image and the second target image, and the first target image is then processed according to the depth information contained in the first depth image, the accuracy of image processing can be improved and the image processing effect optimized.
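The parallax-to-depth conversion mentioned above follows the pinhole stereo relation Z = f * B / d (focal length times baseline divided by disparity): a larger disparity means a closer object. A minimal sketch with illustrative numbers (a real pipeline would use the calibrated parameters of the two cameras):

```python
def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Pinhole stereo relation Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# With a 1000 px focal length and a 5 cm baseline between the two cameras:
near = depth_from_disparity(1000.0, 0.05, 50.0)  # 1.0 m (large disparity, close)
far = depth_from_disparity(1000.0, 0.05, 10.0)   # 5.0 m (small disparity, far)
```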
As shown in fig. 6, in an embodiment, the camera module further includes a third camera, and the target image includes a third target image captured by the third camera, and the image processing method may further include steps 602 to 606, where:
Step 602, a corresponding color histogram is established according to the first target image.
A color histogram is a graph constructed from the color distribution of an image. The electronic device establishes a color histogram for the first target image. Specifically, the electronic device may extract the color parameters of each pixel in the first target image, determine each pixel's color from those parameters using pre-stored parameter ranges corresponding to different colors, count the number of pixels of each color in the target image, and build the histogram from the colors and their pixel counts. Alternatively, the frequency of occurrence of each color can be obtained as the ratio of that color's pixel count to the total number of pixels in the first target image, and the histogram built from the colors in the first target image and their frequencies of occurrence. The color parameters may be expressed in, but are not limited to, the RGB (Red, Green, Blue) color space or the HSB (Hue, Saturation, Brightness) color space.
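A minimal sketch of the frequency variant described above: count pixels per (coarsely quantized) color, then normalize by the total pixel count. The quantization bucket size is an invented illustration, not a value from the source.

```python
from collections import Counter

def color_histogram(pixels, bucket=64):
    """pixels: iterable of (r, g, b) tuples; colors are coarsely quantized
    so that similar shades fall into the same histogram bin."""
    quantized = [(r // bucket, g // bucket, b // bucket) for r, g, b in pixels]
    counts = Counter(quantized)
    total = len(quantized)
    # map each quantized color to its frequency of occurrence
    return {color: n / total for color, n in counts.items()}

pixels = [(255, 0, 0)] * 3 + [(0, 255, 0)]  # 3 red pixels, 1 green pixel
hist = color_histogram(pixels)
```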
Step 604, detecting the degree of dispersion of the color histogram.
The degree of dispersion of the color histogram refers to how much the occurrence frequencies of its colors differ from one another. The smaller the degree of dispersion, the smaller the differences among the occurrence frequencies of the colors in the first target image, and the more uniform the areas the colors occupy; the larger the degree of dispersion, the larger those differences, the larger the differences among the areas the colors occupy, and the more likely large same-color regions appear in the first target image. The electronic device may measure the degree of dispersion of the color histogram using the range, mean deviation, standard deviation, or variance of the color frequencies.
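A hedged sketch of the decision in steps 604 and 606: measure the dispersion of the color-frequency histogram with the standard deviation (one of the measures named above) and fall back to the depth camera when it exceeds a threshold. The threshold value here is an arbitrary illustration, not one taken from the source.

```python
import statistics

def use_depth_camera(frequencies, threshold=0.2):
    """frequencies: per-color occurrence frequencies summing to 1.
    Returns True when dispersion is high enough that binocular
    matching is likely unreliable (large same-color regions)."""
    if len(frequencies) < 2:
        return True  # a single dominant color: texture too uniform
    return statistics.pstdev(frequencies) > threshold

# near-uniform color distribution -> binocular depth is acceptable
assert use_depth_camera([0.26, 0.25, 0.25, 0.24]) is False
# one color dominates (large same-color area) -> use the third camera
assert use_depth_camera([0.85, 0.05, 0.05, 0.05]) is True
```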
Step 606, when the degree of dispersion of the color histogram exceeds a discrete threshold, the first target image is processed according to a second depth image, where the second depth image is obtained according to a third target image.
The discrete threshold can be set according to the actual application requirement; it is the degree of dispersion of the color histogram beyond which the first target image should be processed according to the second depth image. The second depth image is derived from the third target image, which is obtained by calibration processing of the initial image acquired by the third camera. Specifically, the third camera may be a depth camera, for example a TOF (Time of Flight) camera or a structured light camera, and together with the first camera and the second camera it forms a three-camera module. When the electronic device acquires initial images through the camera module, it obtains a third initial image (a depth image) from the third camera, performs monocular correction processing on it to obtain a third intermediate image, and performs binocular correction processing on the third intermediate image and the first intermediate image according to the binocular calibration information between the first camera and the third camera; the resulting third target image contains depth information corresponding to each pixel of the first target image. The electronic device may use the third target image directly as the second depth image, or may process the third target image to obtain the second depth image before processing the first target image; such processing may include, but is not limited to, adjusting the depth information contained in the third target image or aligning the third target image with another image.
By establishing a color histogram for the first target image, detecting its degree of dispersion, and, when that degree exceeds the discrete threshold, processing the first target image according to the second depth image obtained after calibration processing of the initial image collected by the third camera, the electronic device avoids the situation where binocular ranging from the first and second target images cannot yield accurate depth information because the image texture is unclear or large same-color areas exist. This guarantees the accuracy of the depth information and improves the accuracy of image processing.
As shown in fig. 7, in one embodiment, the provided image processing method further includes steps 702 to 706, wherein:
Step 702, when there is an invalid depth value in the second depth image, obtaining the first depth value of the corresponding pixel in the first depth image.
An invalid depth value in the second depth image is a pixel for which depth information is absent or equal to 0. When the electronic device captures an image of the current scene, properties of the scene such as its colors can affect the depth image acquired by the third camera, leaving invalid depth values in the second depth image. For example, if an area of the scene lies farther from the camera than the measurable range of the third camera, the corresponding pixels of that area in the second depth image hold invalid depth values. When an invalid depth value exists in the second depth image, the electronic device may obtain the first depth value of the corresponding pixel in the first depth image.
Step 704, replacing the invalid depth value in the second depth image with the first depth value to obtain the target depth image.
The target depth image combines the depth information of the second depth image with that of the first depth image. Specifically, the electronic device replaces each invalid depth value in the second depth image with the first depth value of the corresponding pixel in the first depth image, thereby obtaining the target depth image.
Step 706, the first target image is processed according to the target depth image.
The electronic device processes the first target image according to the target depth image, that is, according to the depth information contained in the target depth image. Because the target depth image combines the depth information of the first depth image and the second depth image, its depth information is more accurate than that of either image alone; processing the first target image according to the target depth image therefore improves the accuracy of image processing.
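Steps 702 to 706 can be sketched as a per-pixel substitution: wherever the depth-camera image (second depth image) holds an invalid value (0 here), take the binocular depth (first depth image) at the same pixel instead. The numeric values are invented for illustration.

```python
import numpy as np

def merge_depth(second_depth, first_depth, invalid=0):
    """Replace invalid pixels of the second depth image with the
    corresponding first-depth-image values (the target depth image)."""
    second_depth = np.asarray(second_depth, dtype=np.float64)
    first_depth = np.asarray(first_depth, dtype=np.float64)
    return np.where(second_depth == invalid, first_depth, second_depth)

second = np.array([[1.2, 0.0], [0.0, 3.5]])  # 0.0 marks invalid depth pixels
first  = np.array([[1.3, 2.1], [2.8, 3.4]])  # binocular depth estimates
target = merge_depth(second, first)
# → [[1.2, 2.1], [2.8, 3.5]]
```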
In an embodiment, when the initial image is collected by the camera module, the electronic device may further obtain the target application program that initiated the collection instruction for collecting the initial image, and obtain the collection aperture value of the camera module when it is determined that the target application program belongs to the first type of application program.
An application is a computer program that accomplishes certain tasks; here, the application has the ability to call a camera to acquire an image. For example, Instagram and WeChat may capture an image through the camera (WeChat may also scan a two-dimensional code image), and a payment application or lock-screen application may capture a face image through the camera for recognition, but the examples are not limited thereto. The collection instruction may be generated by the user clicking a button on the display screen, pressing a control on the touch screen, and so on. When the electronic device acquires the initial image through the camera module, it obtains the target application program that initiated the collection instruction. The electronic device may classify applications, for example according to their requirements on image accuracy or image processing speed; the first type of application may be one that requires higher image accuracy or tolerates a lower image processing speed. On obtaining the target application program corresponding to the collection instruction, the electronic device determines whether it belongs to the first type of application program; if so, it acquires the collection aperture value of the camera module, obtains the first target calibration information corresponding to the collection aperture value, and corrects the initial image to obtain the target image. This improves the accuracy of the target image, avoids the loss of sharpness that can result from processing every initial image with unified calibration information, and optimizes the image processing effect.
In one embodiment, the provided image processing method may further include: when the target application program is judged to belong to the second type of application program, acquiring preset second target calibration information, and correcting the initial image according to the second target calibration information to obtain a target image.
A camera usually has a default aperture value in actual use; when the camera has just started, it collects images at this default aperture. The preset second target calibration information is the calibration information corresponding to the default aperture value. The electronic device may categorize applications that capture images at the default aperture value as the second type of application, and may likewise classify applications with lower requirements on image accuracy or higher requirements on image processing speed as the second type. When the target application program is judged to belong to the first type, the electronic device acquires the first target calibration information corresponding to the collection aperture value and corrects the initial image according to it; when the target application program is judged to belong to the second type, the electronic device corrects the initial image according to the preset second target calibration information to obtain the target image.
By judging the classification of the target application program that initiated the collection instruction, the electronic device obtains the corresponding calibration information to correct the initial image; when the target application program belongs to the second type, the preset second target calibration information is used directly to process the initial image, which improves image processing efficiency.
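The selection logic described above (and the nearest-aperture fallback recited in claim 1) can be sketched as a lookup table keyed by aperture value. All aperture values, labels, and the class encoding below are invented for illustration.

```python
# Hypothetical per-aperture calibration table; claim 1 states that when no
# calibration exists for the exact collection aperture value, the calibration
# for the closest calibrated aperture is used instead.
CALIBRATION = {1.8: "calib_f1.8", 2.4: "calib_f2.4", 4.0: "calib_f4.0"}
DEFAULT_APERTURE = 2.4  # preset second target calibration corresponds to this

def select_calibration(app_class, collection_aperture):
    if app_class == 2:  # second type: speed over accuracy, use the preset
        return CALIBRATION[DEFAULT_APERTURE]
    if collection_aperture in CALIBRATION:  # first type, exact match
        return CALIBRATION[collection_aperture]
    # first type, no exact match: fall back to the nearest calibrated aperture
    nearest = min(CALIBRATION, key=lambda a: abs(a - collection_aperture))
    return CALIBRATION[nearest]

assert select_calibration(2, 1.8) == "calib_f2.4"  # preset; aperture ignored
assert select_calibration(1, 2.0) == "calib_f1.8"  # nearest to 2.0 is 1.8
```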
It should be understood that although the steps in the flowcharts of figs. 2, 3, and 5-7 are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in figs. 2, 3, and 5-7 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and whose order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
Fig. 8 is a block diagram showing the configuration of an image processing apparatus according to an embodiment. As shown in fig. 8, the image processing apparatus includes an aperture value acquisition module 802, a calibration information acquisition module 804, and a correction processing module 806. Wherein:
The aperture value obtaining module 802 is configured to obtain the collection aperture value corresponding to the camera module when the initial image is collected through the camera module.
The calibration information obtaining module 804 is configured to obtain, from calibration information, first target calibration information corresponding to the collection aperture value, where the calibration information is obtained by calibrating the camera module at different aperture values.
The correction processing module 806 is configured to perform correction processing on the initial image according to the first target calibration information to obtain a target image.
The image processing apparatus provided by this embodiment obtains the collection aperture value corresponding to the camera module when the initial image is collected through the camera module, obtains from the calibration information the first target calibration information corresponding to the collection aperture value, and corrects the initial image according to the first target calibration information to obtain the target image. Because the calibration information corresponding to the collection aperture value is used to process the initial image, the accuracy of image processing is improved.
In an embodiment, the provided image processing apparatus further includes a calibration processing module 808, where the calibration processing module 808 is configured to sequentially obtain different aperture values of the camera module, and capture the calibration plate with the different aperture values through the camera module to obtain a set of calibration images corresponding to each aperture value, where the set of calibration images includes calibration images captured by each camera in the camera module; and calibrating the camera module according to a group of calibration images corresponding to each aperture value to obtain calibration information corresponding to each aperture value of the camera module.
In an embodiment, the correction processing module 806 may be further configured to perform monocular correction processing on initial images acquired by each camera in the camera module according to monocular calibration information included in the first target calibration information to obtain an intermediate image; and performing binocular correction processing on the intermediate image according to binocular calibration information contained in the first target calibration information to obtain target images respectively corresponding to the cameras.
In one embodiment, the camera module includes a first camera and a second camera, the target image includes a first target image collected by the first camera and a second target image collected by the second camera, the image processing apparatus further includes an image processing module 810, and the image processing module 810 is configured to obtain a first depth image according to the first target image and the second target image; the first target image is processed according to the first depth image.
In one embodiment, the camera module further includes a third camera, the target image includes a third target image collected by the third camera, and the image processing module 810 may be further configured to establish a corresponding color histogram according to the first target image; detecting the dispersion degree of the color histogram; and when the discrete degree of the color histogram exceeds a discrete threshold value, processing the first target image according to a second depth image, wherein the second depth image is obtained according to a third target image.
In an embodiment, the image processing module 810 may be further configured to, when an invalid depth value exists in the second depth image, obtain a first depth value corresponding to a pixel point corresponding to the invalid depth value in the first depth image; replacing the invalid depth value in the second depth image with the first depth value to obtain a target depth image; and processing the first target image according to the target depth image.
In one embodiment, the correction processing module 806 may be further configured to, when acquiring an initial image through the camera module, obtain a target application program that initiates an acquisition instruction to acquire the initial image; when the target application program is judged to belong to the first type of application program, acquiring a collection aperture value of the camera module; and acquiring first target calibration information corresponding to the acquired aperture value from the calibration information, and correcting the initial image according to the first target calibration information to obtain a target image.
In an embodiment, the correction processing module 806 may be further configured to, when it is determined that the target application belongs to the second type of application, obtain preset second target calibration information, and perform correction processing on the initial image according to the second target calibration information to obtain the target image.
The division of the modules in the image processing apparatus is only for illustration, and in other embodiments, the image processing apparatus may be divided into different modules as needed to complete all or part of the functions of the image processing apparatus.
Fig. 9 is a schematic diagram of an internal structure of an electronic device in one embodiment. As shown in fig. 9, the electronic device includes a processor and a memory connected by a system bus. The processor provides computing and control capability and supports the operation of the whole electronic device. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program can be executed by the processor to implement the image processing method provided in the embodiments. The internal memory provides a cached execution environment for the operating system and computer programs in the non-volatile storage medium. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
The implementation of each module in the image processing apparatus provided in the embodiment of the present application may be in the form of a computer program. The computer program may be run on a terminal or a server. The program modules constituted by the computer program may be stored on the memory of the terminal or the server. Which when executed by a processor, performs the steps of the method described in the embodiments of the present application.
The embodiment of the application also provides the electronic equipment. The electronic device includes therein an Image Processing circuit, which may be implemented using hardware and/or software components, and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 10 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 10, for convenience of explanation, only aspects of the image processing technology related to the embodiments of the present application are shown.
As shown in fig. 10, the image processing circuit includes an ISP processor 1040 and control logic 1050. The image data captured by the imaging device 1010 is first processed by the ISP processor 1040, which analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of the imaging device 1010. The imaging device 1010 may include a camera having one or more lenses 1012 and an image sensor 1014. The image sensor 1014 may include an array of color filters (e.g., Bayer filters); it may acquire the light intensity and wavelength information captured by each imaging pixel and provide a set of raw image data that can be processed by the ISP processor 1040. The sensor 1020 (e.g., a gyroscope) may provide parameters for processing the acquired image (e.g., anti-shake parameters) to the ISP processor 1040 based on the sensor 1020 interface type. The sensor 1020 interface may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of the above.
In addition, the image sensor 1014 may also send raw image data to the sensor 1020, the sensor 1020 may provide the raw image data to the ISP processor 1040 based on the type of interface of the sensor 1020, or the sensor 1020 may store the raw image data in the image memory 1030.
The ISP processor 1040 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 1040 may perform one or more image processing operations on the raw image data, gathering statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 1040 may also receive image data from image memory 1030. For example, the sensor 1020 interface sends raw image data to the image memory 1030, and the raw image data in the image memory 1030 is then provided to the ISP processor 1040 for processing. The image Memory 1030 may be part of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from the image sensor 1014 interface, the sensor 1020 interface, or the image memory 1030, the ISP processor 1040 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to the image memory 1030 for additional processing before being displayed. The ISP processor 1040 receives processed data from the image memory 1030 and performs image data processing on it in the raw domain and in the RGB and YCbCr color spaces. The image data processed by the ISP processor 1040 may be output to the display 1070 for viewing by a user and/or further processed by a Graphics Processing Unit (GPU). Further, the output of the ISP processor 1040 may also be sent to the image memory 1030, and the display 1070 may read image data from the image memory 1030. In one embodiment, the image memory 1030 may be configured to implement one or more frame buffers. Further, the output of the ISP processor 1040 may be transmitted to the encoder/decoder 1060 to encode/decode the image data. The encoded image data may be saved and decompressed before being displayed on the display 1070. The encoder/decoder 1060 may be implemented by a CPU, GPU, or coprocessor.
The statistics determined by the ISP processor 1040 may be sent to the control logic 1050 unit. For example, the statistical data may include image sensor 1014 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 1012 shading correction, and the like. Control logic 1050 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of imaging device 1010 and ISP processor 1040 based on the received statistical data. For example, the control parameters of the imaging device 1010 may include sensor 1020 control parameters (e.g., gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 1012 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), and lens 1012 shading correction parameters.
The image processing method described above can be implemented using the image processing technique of fig. 10. In embodiments of the present application, the image processing circuitry may comprise one or more imaging devices.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the image processing method.
A computer program product comprising instructions which, when run on a computer, cause the computer to perform an image processing method.
Any reference to memory, storage, a database, or another medium used by embodiments of the present application may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus DRAM (RDRAM), and direct Rambus DRAM (DRDRAM).
The above-mentioned embodiments express only several implementations of the present application, and while their description is specific and detailed, it should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An image processing method, comprising:
sequentially acquiring different aperture values of a camera module, and shooting a calibration plate by the camera module according to the different aperture values to obtain a group of calibration images corresponding to each aperture value, wherein the group of calibration images comprise calibration images shot by each camera in the camera module;
calibrating the camera module according to a group of calibration images corresponding to each aperture value to obtain calibration information corresponding to each aperture value of the camera module;
acquiring a collection aperture value corresponding to a camera module when an initial image is collected through the camera module;
acquiring first target calibration information corresponding to the acquired aperture value from calibration information, and acquiring the calibration information corresponding to the aperture value closest to the acquired aperture value as the first target calibration information when the calibration information corresponding to the acquired aperture value does not exist, wherein the calibration information is obtained by calibrating the camera module under different aperture values;
and correcting the initial image according to the first target calibration information to obtain a target image.
2. The method according to claim 1, wherein the performing correction processing on the initial image according to the first target calibration information comprises:
performing monocular correction processing on initial images acquired by each camera in the camera module according to monocular calibration information contained in the first target calibration information to obtain intermediate images;
and performing binocular correction processing on the intermediate image according to binocular calibration information contained in the first target calibration information to obtain target images respectively corresponding to the cameras.
3. The method of claim 1, wherein the camera module comprises a first camera and a second camera, and the target image comprises a first target image captured by the first camera and a second target image captured by the second camera; the method further comprises the following steps:
obtaining a first depth image according to the first target image and the second target image;
and processing the first target image according to the first depth image.
4. The method of claim 1, wherein the camera module comprises a first camera and a third camera, and the target image comprises a first target image captured by the first camera and a third target image captured by the third camera; the method further comprises the following steps:
establishing a corresponding color histogram according to the first target image;
detecting a degree of dispersion of the color histogram;
and when the discrete degree of the color histogram exceeds a discrete threshold value, processing the first target image according to a second depth image, wherein the second depth image is obtained according to the third target image.
5. The method of claim 4, wherein the camera module further comprises a second camera, the target image further comprises a second target image captured by the second camera, and the processing the first target image according to the second depth image comprises:
when an invalid depth value exists in the second depth image, acquiring a first depth value corresponding to a pixel point corresponding to the invalid depth value in a first depth image, wherein the first depth image is obtained according to the first target image and a second target image;
replacing the invalid depth value in the second depth image with the first depth value to obtain a target depth image;
and processing the first target image according to the target depth image.
6. The method according to claim 1, wherein the acquiring an aperture value corresponding to the camera module when acquiring the initial image by the camera module comprises:
when an initial image is collected through the camera module, a target application program initiating a collection instruction for collecting the initial image is obtained;
when the target application program is judged to belong to a first type of application program, acquiring the acquisition aperture value of the camera module;
the method further comprises the following steps:
and when the target application program is judged to belong to the second type of application program, acquiring preset second target calibration information, and correcting the initial image according to the second target calibration information to obtain a target image.
7. An image processing apparatus, comprising:
a calibration processing module, configured to sequentially acquire different aperture values of a camera module, and capture a calibration plate with the camera module at each of the different aperture values to obtain a group of calibration images corresponding to each aperture value, wherein each group of calibration images comprises a calibration image captured by each camera in the camera module; and calibrate the camera module according to the group of calibration images corresponding to each aperture value to obtain calibration information corresponding to each aperture value of the camera module;
an image acquisition module, configured to acquire an initial image through the camera module and obtain an acquisition aperture value corresponding to the camera module;
a calibration information acquisition module, configured to acquire, from the calibration information, first target calibration information corresponding to the acquisition aperture value, and, when no calibration information corresponding to the acquisition aperture value exists, acquire the calibration information corresponding to the aperture value closest to the acquisition aperture value as the first target calibration information, wherein the calibration information is obtained by calibrating the camera module at different aperture values;
and a correction processing module, configured to correct the initial image according to the first target calibration information to obtain a target image.
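The calibration information acquisition module's fallback rule — use an exact aperture match when one exists, otherwise the calibration for the nearest calibrated aperture — reduces to a small lookup. A sketch under the assumption that calibration info is keyed by numeric aperture value:

```python
def lookup_calibration(aperture, table):
    """Return calibration info for the given acquisition aperture
    value; when no exact entry exists, fall back to the entry whose
    aperture value is numerically closest."""
    if aperture in table:
        return table[aperture]
    # No exact match: pick the calibrated aperture nearest to the
    # acquisition aperture value.
    nearest = min(table, key=lambda a: abs(a - aperture))
    return table[nearest]
```

For example, with calibrations stored for f/1.8 and f/2.4, a capture at f/2.2 would fall back to the f/2.4 calibration.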
8. The apparatus according to claim 7, wherein the correction processing module is further configured to perform monocular correction on the initial image acquired by each camera in the camera module according to monocular calibration information contained in the first target calibration information to obtain intermediate images; and perform binocular correction on the intermediate images according to binocular calibration information contained in the first target calibration information to obtain target images respectively corresponding to the cameras.
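Claim 8's two-stage pipeline — per-camera monocular correction followed by binocular correction over the pair — can be expressed abstractly as below. The `mono_correct` and `stereo_correct` callables stand in for real undistortion and stereo-rectification routines (e.g. OpenCV's `undistort` and `initUndistortRectifyMap`/`remap`); this is a structural sketch, not the patented method.

```python
def correct_module(initial_images, mono_calibs, stereo_calib,
                   mono_correct, stereo_correct):
    """Stage 1: monocular correction of each camera's initial image
    with its own monocular calibration info, yielding intermediate
    images. Stage 2: binocular correction of the intermediate images
    with the shared binocular calibration info."""
    intermediate = [mono_correct(img, cal)
                    for img, cal in zip(initial_images, mono_calibs)]
    return stereo_correct(intermediate, stereo_calib)
```

Separating the two stages lets each camera's lens distortion be removed independently before the pair is jointly rectified onto a common image plane.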
9. An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of the image processing method according to any one of claims 1 to 6.
10. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method according to any one of claims 1 to 6.
CN201811454206.6A 2018-11-30 2018-11-30 Image processing method, image processing device, electronic equipment and computer readable storage medium Active CN109685853B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811454206.6A CN109685853B (en) 2018-11-30 2018-11-30 Image processing method, image processing device, electronic equipment and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN109685853A CN109685853A (en) 2019-04-26
CN109685853B true CN109685853B (en) 2021-02-02

Family

ID=66185634

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811454206.6A Active CN109685853B (en) 2018-11-30 2018-11-30 Image processing method, image processing device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN109685853B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110175960B (en) * 2019-05-21 2021-04-13 Oppo广东移动通信有限公司 Image correction method, image correction device, electronic device and storage medium
CN110276831B (en) * 2019-06-28 2022-03-18 Oppo广东移动通信有限公司 Method and device for constructing three-dimensional model, equipment and computer-readable storage medium
CN111050027B (en) * 2019-12-31 2021-12-28 华兴源创(成都)科技有限公司 Lens distortion compensation method, device, equipment and storage medium
CN113766090B (en) * 2020-06-02 2023-08-01 武汉Tcl集团工业研究院有限公司 Image processing method, terminal and storage medium
CN113132626B (en) * 2021-03-26 2022-05-31 联想(北京)有限公司 Image processing method and electronic equipment
CN113251951B (en) * 2021-04-26 2024-03-01 湖北汽车工业学院 Calibration method of line structured light vision measurement system based on single calibration surface mapping

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4517460A (en) * 1981-06-24 1985-05-14 U.S. Philips Corporation Method of calibrating a gamma camera, and a gamma camera including a calibration device
US5606392A (en) * 1996-06-28 1997-02-25 Eastman Kodak Company Camera using calibrated aperture settings for exposure control
CN101140661A (en) * 2007-09-04 2008-03-12 杭州镭星科技有限公司 Real time object identification method taking dynamic projection as background
CN101674443A (en) * 2009-09-08 2010-03-17 长春理工大学 Method for correcting colors of projector
CN107566720A (en) * 2017-08-25 2018-01-09 维沃移动通信有限公司 A kind of method and mobile terminal of the calibration value for updating mobile terminal
CN108305233A (en) * 2018-03-06 2018-07-20 哈尔滨工业大学 A kind of light field image bearing calibration for microlens array error

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3395066B1 (en) * 2015-12-25 2022-08-03 BOE Technology Group Co., Ltd. Depth map generation apparatus, method and non-transitory computer-readable medium therefor
CN105979244A (en) * 2016-05-31 2016-09-28 十二维度(北京)科技有限公司 Method and system used for converting 2D image to 3D image based on deep learning
CN106651870B (en) * 2016-11-17 2020-03-24 山东大学 Segmentation method of image out-of-focus fuzzy region in multi-view three-dimensional reconstruction
KR102133017B1 (en) * 2017-03-29 2020-07-10 한국전자통신연구원 Projector and calibration method for projector
CN108055471B (en) * 2017-11-16 2020-05-15 深圳市维海德技术股份有限公司 Aperture correction method and device
CN111126146B (en) * 2018-04-12 2024-03-05 Oppo广东移动通信有限公司 Image processing method, image processing device, computer readable storage medium and electronic apparatus
CN108760246B (en) * 2018-05-25 2020-01-07 上海复瞻智能科技有限公司 Method for detecting eye movement range in head-up display system



Similar Documents

Publication Publication Date Title
CN109767467B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN109685853B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107948519B (en) Image processing method, device and equipment
KR102293443B1 (en) Image processing method and mobile terminal using dual camera
CN109089047B (en) Method and device for controlling focusing, storage medium and electronic equipment
EP3480783B1 (en) Image-processing method, apparatus and device
CN109712192B (en) Camera module calibration method and device, electronic equipment and computer readable storage medium
US11431915B2 (en) Image acquisition method, electronic device, and non-transitory computer readable storage medium
CN107704798B (en) Image blurring method and device, computer readable storage medium and computer device
CN107481186B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN110290323B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN109559352B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN109963080B (en) Image acquisition method and device, electronic equipment and computer storage medium
CN108053438B (en) Depth of field acquisition method, device and equipment
CN112004029B (en) Exposure processing method, exposure processing device, electronic apparatus, and computer-readable storage medium
CN107395991B (en) Image synthesis method, image synthesis device, computer-readable storage medium and computer equipment
CN108156369B (en) Image processing method and device
CN110866486B (en) Subject detection method and apparatus, electronic device, and computer-readable storage medium
CN107959841B (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN107948617B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN107563979B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN109598763B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN111246100B (en) Anti-shake parameter calibration method and device and electronic equipment
CN108401110B (en) Image acquisition method and device, storage medium and electronic equipment
CN109559353B (en) Camera module calibration method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant