CN107846556B - Imaging method, imaging device, mobile terminal and storage medium - Google Patents


Info

Publication number: CN107846556B
Application number: CN201711243972.3A
Authority: CN (China)
Prior art keywords: exposure, camera, main image
Other versions: CN107846556A (Chinese)
Inventor: 张弓
Assignee (current and original): Guangdong Oppo Mobile Telecommunications Corp Ltd
Legal status: Active (granted). The legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis.
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd; priority to CN201711243972.3A; publication of CN107846556A; application granted; publication of CN107846556B.

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/71: Circuitry for evaluating the brightness variation
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application provides an imaging method, an imaging device, a mobile terminal and a storage medium. The method comprises the following steps: acquiring a main image acquired by a main camera and an auxiliary image acquired by an auxiliary camera; determining an exposure adjustment area of the main image and corresponding exposure compensation information according to exposure information obtained by photometry; acquiring the depth information of the main image according to the main image and the auxiliary image; determining a foreground area and a background area of the main image according to the depth information; performing exposure adjustment on the main image, in the overlapping part of the foreground area and the exposure adjustment area, according to the exposure compensation information corresponding to the exposure adjustment area; and blurring the exposure-adjusted main image in the background area to obtain an imaging image. The method improves the exposure effect of the imaging image on the one hand and the accuracy of the depth information on the other, so that the image processing effect is better, solving the technical problem of poor image exposure effect in the prior art.

Description

Imaging method, imaging device, mobile terminal and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an imaging method, an imaging apparatus, a mobile terminal, and a storage medium.
Background
With the continuous development of mobile terminal technology, more and more users take pictures with mobile terminals that have dual cameras. When a user opens the two cameras and enters portrait mode for preview, if the viewed scene has a high dynamic range, local underexposure and/or local overexposure may occur in the preview picture, so the photographing effect is poor. For example, when photographing a person, if local underexposure and local overexposure occur, it is difficult to make out the details of the person's face, and the photographed image is far from what the user desires.
In the prior art, exposure parameters are determined for the whole image, and exposure processing is then performed on the image according to those parameters. In this way, when the exposure parameters suitable for different areas of the image differ greatly, an underexposed or overexposed area appears in the image, and the exposure effect is poor.
Content of application
The present application is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, the application provides an imaging method. An exposure adjustment area of a main image and corresponding exposure compensation information are determined through pre-photometry, so that exposure adjustment is performed on the main image according to the exposure compensation information corresponding to the exposure adjustment area; overexposed and underexposed areas in the imaging image can thus be effectively avoided, and the exposure effect of the imaging image is improved. Meanwhile, the auxiliary image shot by the auxiliary camera is shot synchronously with the main image shot by the main camera, and the exposure-adjusted main image is subsequently blurred according to the corresponding auxiliary image. On the one hand this improves the imaging effect of the picture; on the other hand it improves the accuracy of the depth information, so that the image processing effect is better, solving the technical problem that the existing image exposure effect is poor.
The application provides an imaging device.
The application provides a mobile terminal.
The present application provides a computer-readable storage medium.
To achieve the above object, an embodiment of a first aspect of the present application provides an imaging method, including:
acquiring a main image acquired by a main camera and acquiring an auxiliary image acquired by an auxiliary camera;
determining an exposure adjustment area of the main image and corresponding exposure compensation information according to exposure information obtained by photometry;
acquiring depth information of the main image according to the main image and the auxiliary image;
determining a foreground area and a background area of the main image according to the depth information of the main image;
performing exposure adjustment on the main image according to exposure compensation information corresponding to the exposure adjustment area at the overlapped part of the foreground area and the exposure adjustment area;
and blurring the main image after exposure adjustment in the background area to obtain an imaging image.
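The six steps above can be sketched as a single processing pipeline. This is a minimal illustrative sketch, not the patent's implementation: every helper function is a hypothetical placeholder supplied by the caller, and regions are modeled as plain sets of pixel coordinates.

```python
def imaging_pipeline(main_image, sub_image, metering_info,
                     find_exposure_areas, compute_depth,
                     split_fg_bg, apply_compensation, blur):
    """Sketch of the claimed method. All helpers are hypothetical placeholders:
    images are dicts mapping (y, x) -> intensity, regions are frozensets of (y, x)."""
    # Steps 101-102: exposure adjustment areas + compensation info from photometry
    areas = find_exposure_areas(main_image, metering_info)
    # Step 103: depth information of the main image from the main/sub image pair
    depth = compute_depth(main_image, sub_image)
    # Step 104: foreground/background regions from the depth information
    fg, bg = split_fg_bg(main_image, depth)
    # Step 105: adjust exposure only where the foreground overlaps an adjustment area
    for area, compensation in areas:
        overlap = area & fg
        main_image = apply_compensation(main_image, overlap, compensation)
    # Step 106: blur the background region to obtain the imaging image
    return blur(main_image, bg)
```

Because the helpers are injected, the same skeleton works whether the regions come from a 3x3 metering grid or a per-pixel mask.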
According to the imaging method, the main image acquired by the main camera and the auxiliary image acquired by the auxiliary camera are acquired, and the exposure adjustment area of the main image and the corresponding exposure compensation information are then determined according to the exposure information obtained by photometry. The depth information of the main image is then acquired according to the main image and the auxiliary image; finally, the foreground area and the background area of the main image are determined according to the depth information, exposure adjustment is performed on the main image in the overlapping part of the foreground area and the exposure adjustment area according to the corresponding exposure compensation information, and the exposure-adjusted main image is blurred in the background area to obtain the imaging image. In this embodiment, the exposure adjustment area and the corresponding exposure compensation information of the main image are determined by pre-photometry, so that exposure adjustment of the main image is performed according to the exposure compensation information corresponding to the exposure adjustment area; overexposed and underexposed areas are thereby effectively prevented from occurring in the imaging image, and the exposure effect of the imaging image is improved. Meanwhile, the auxiliary image is shot synchronously with the main image, so the exposure-adjusted main image can subsequently be blurred according to the corresponding auxiliary image; on the one hand this improves the imaging effect of the picture, and on the other hand it improves the accuracy of the depth information, so that the image processing effect is better and the technical problem of poor image exposure effect in the prior art is solved.
To achieve the above object, an embodiment of a second aspect of the present application provides an image forming apparatus, including:
the acquisition module is used for acquiring a main image acquired by the main camera and acquiring an auxiliary image acquired by the auxiliary camera;
the first determining module is used for determining an exposure adjusting area of the main image and corresponding exposure compensation information according to exposure information obtained by photometry;
the depth-of-field module is used for acquiring the depth information of the main image according to the main image and the auxiliary image;
the first processing module is used for determining a foreground area and a background area of the main image according to the depth information of the main image;
the exposure module is used for carrying out exposure adjustment on the main image at the overlapped part of the foreground area and the exposure adjustment area according to the exposure compensation information corresponding to the exposure adjustment area;
and the blurring module is used for blurring the main image after exposure adjustment in the background area to obtain an imaging image.
According to the imaging device, the main image acquired by the main camera and the auxiliary image acquired by the auxiliary camera are acquired, and the exposure adjustment area of the main image and the corresponding exposure compensation information are then determined according to the exposure information obtained by photometry. The depth information of the main image is then acquired according to the main image and the auxiliary image; finally, the foreground area and the background area of the main image are determined according to the depth information, exposure adjustment is performed on the main image in the overlapping part of the foreground area and the exposure adjustment area according to the corresponding exposure compensation information, and the exposure-adjusted main image is blurred in the background area to obtain the imaging image. In this embodiment, the exposure adjustment area and the corresponding exposure compensation information of the main image are determined by pre-photometry, so that exposure adjustment of the main image is performed according to the exposure compensation information corresponding to the exposure adjustment area; overexposed and underexposed areas are thereby effectively prevented from occurring in the imaging image, and the exposure effect of the imaging image is improved. Meanwhile, the auxiliary image is shot synchronously with the main image, so the exposure-adjusted main image can subsequently be blurred according to the corresponding auxiliary image; on the one hand this improves the imaging effect of the picture, and on the other hand it improves the accuracy of the depth information, so that the image processing effect is better and the technical problem of poor exposure effect in the prior art is solved.
To achieve the above object, a third aspect of the present application provides a mobile terminal, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the imaging method according to the embodiments of the first aspect of the present application when executing the program.
In order to achieve the above object, a fourth aspect of the present application provides a computer-readable storage medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the imaging method according to the first aspect of the present application.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flow chart of a first imaging method provided in an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating the partitioning of a main image according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of triangulation;
FIG. 4 is a schematic flow chart of a second imaging method provided by an embodiment of the present application;
FIG. 5 is a schematic flow chart of a third imaging method provided by an embodiment of the present application;
fig. 6 is a schematic structural diagram of an imaging device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of another imaging device provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal device according to another embodiment of the present application;
FIG. 9 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
An imaging method, an apparatus, a mobile terminal, and a storage medium of embodiments of the present application are described below with reference to the accompanying drawings.
The imaging method may specifically be executed by hardware devices with two cameras, such as a mobile phone, a tablet computer, a personal digital assistant or a wearable device. Such a device includes a camera module comprising a main camera and an auxiliary camera, each provided with its own independent lens, image sensor and voice coil motor. Both the main camera and the auxiliary camera are connected to a camera connector, so that the voice coil motor is driven according to the current value provided by the camera connector; driven by the voice coil motor, each camera adjusts the distance between its lens and its image sensor to achieve focusing.
As a possible application scenario, the resolution of the secondary camera is lower than that of the primary camera. When focusing, only the secondary camera may be used: once the secondary camera is in focus, a second driving current value of its motor is obtained; then, given that the primary and secondary cameras have the same focusing distance, a first driving current value for the motor of the primary camera is determined from the second driving current value, and the primary camera is driven with the first driving current value to focus. Because the resolution of the secondary camera is low, its image processing is fast, which speeds up focusing and addresses the technical problem in the prior art that dual cameras focus slowly.
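One plausible way to map the sub-camera's motor current to the main camera's is a per-device calibration table, interpolated at query time. This is an illustrative sketch only: the patent does not specify the mapping, and every number in the table below is made up.

```python
# Hypothetical calibration table of (sub-camera VCM current, main-camera VCM
# current) pairs, measured so both cameras focus at the same object distance.
# All values are illustrative assumptions, not from the patent.
CALIBRATION = [(10, 14), (20, 27), (30, 41), (40, 56)]

def main_current_from_sub(sub_current, table=CALIBRATION):
    """Linearly interpolate the main camera's driving current from the
    sub camera's current at the same focusing distance."""
    pts = sorted(table)
    if sub_current <= pts[0][0]:
        return float(pts[0][1])          # clamp below the table
    if sub_current >= pts[-1][0]:
        return float(pts[-1][1])         # clamp above the table
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= sub_current <= x1:
            t = (sub_current - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)    # linear interpolation between anchors
```

A denser table, or a fitted curve, would be used in practice; linear interpolation just keeps the example short.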
In the specific implementation of the dual cameras, different camera combinations can be selected as the main camera and the auxiliary camera, so as to suit different user needs.
In one application scenario, a higher focusing speed is required, so the main camera of the dual cameras is specifically a common camera and the sub-camera is specifically a dual pixel (PD) camera. The resolution of the dual PD camera is lower than that of the common camera, so it has a faster focusing speed.
It should be noted that each pixel of a dual PD camera is composed of two units; the two units can serve as phase-detection focusing points, or be combined to form the image of a single pixel, which greatly improves focusing performance during electronic framing. A dual PD complementary metal oxide semiconductor (CMOS) sensor camera, i.e. a dual PD camera that specifically uses a CMOS sensor, is a commonly used dual PD camera and was first applied in single-lens reflex cameras.
In another application scenario, a better imaging effect is required, so a wide-angle camera and a telephoto camera are combined as the dual cameras, and the main and auxiliary camera roles are switched according to the shooting requirement. Specifically, when shooting a close-up, the wide-angle lens is used as the main camera and the telephoto lens as the sub-camera; when shooting a long shot, the telephoto lens is used as the main camera and the wide-angle lens as the sub-camera. This not only realizes an optical zoom function but also guarantees imaging quality and the subsequent blurring effect.
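The wide/tele role switch above reduces to a simple rule on subject distance. A minimal sketch, assuming a distance threshold (the 2 m cutoff is an illustrative assumption, not from the patent):

```python
def select_cameras(subject_distance_m, close_up_threshold_m=2.0):
    """Pick (main, sub) roles for a wide-angle + telephoto dual-camera module.
    Close subjects use the wide-angle lens as the main camera; distant
    subjects use the telephoto lens. The threshold is a hypothetical value."""
    if subject_distance_m < close_up_threshold_m:
        return ("wide", "tele")   # close-up scene
    return ("tele", "wide")       # long shot
```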
In the selection of the specific dual cameras, there may be multiple possible implementation manners, which is not described in detail in this embodiment.
Fig. 1 is a schematic flowchart of a first imaging method according to an embodiment of the present disclosure.
As shown in fig. 1, the imaging method includes the steps of:
step 101, acquiring a main image acquired by a main camera and acquiring a secondary image acquired by a secondary camera.
In this embodiment, in scenes with a high dynamic range, the luminance range of the picture framed by the dual cameras spans widely, and it is easy for part of the picture to be underexposed while another part is overexposed, resulting in a poor shooting effect.
In such a scenario, the ambient brightness may be determined in advance from the average brightness of a plurality of light measuring points, and the main camera and the sub-camera may be determined from the two cameras accordingly.
Specifically, when the ambient brightness is not higher than a threshold brightness, the light is insufficient; if a high-resolution camera were used as the main camera to take the picture, more noise might occur, resulting in a poor imaging effect. Therefore, in this embodiment, when the light is insufficient, the high-sensitivity camera may be used as the main camera and the high-resolution camera as the sub-camera, which reduces noise in the image and improves the imaging effect. Conversely, when the ambient brightness is higher than the threshold brightness, i.e. when the light is sufficient, the high-resolution camera images clearly with little noise; therefore the high-resolution camera may be used as the main camera and the high-sensitivity camera as the sub-camera, so as to improve the imaging effect.
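The brightness-based role assignment can be sketched in a few lines. The averaging of light measuring points follows the text above; the concrete threshold value is an illustrative assumption.

```python
def assign_by_brightness(metering_points, threshold=120):
    """Estimate ambient brightness as the average of the photometry points,
    then pick the high-sensitivity camera as main in low light and the
    high-resolution camera as main in bright light.
    The threshold of 120 (on an arbitrary brightness scale) is hypothetical."""
    ambient = sum(metering_points) / len(metering_points)
    if ambient <= threshold:
        return {"main": "high_sensitivity", "sub": "high_resolution"}
    return {"main": "high_resolution", "sub": "high_sensitivity"}
```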
After the main camera and the auxiliary camera are determined, both can be used to frame and shoot simultaneously, obtaining a main image and a sub-image respectively.
The imaged picture may be previewed before being captured. As one possible implementation, only the picture acquired by the main camera is previewed; when the user sees a satisfactory preview picture and clicks the photographing key, the main camera and the auxiliary camera are controlled to frame and shoot simultaneously. Alternatively, the picture acquired by the sub-camera may be previewed, and when the user sees a satisfactory preview picture and clicks the photographing key, the two cameras are likewise controlled to shoot simultaneously; this is not limited here.
Step 102, determining an exposure adjustment area of the main image and corresponding exposure compensation information according to exposure information obtained by photometry.
In the embodiment of the present invention, the exposure information is a plurality of exposure values obtained by matrix photometry, and the exposure values of a plurality of light measuring points arranged in a matrix in the main image may be determined according to the exposure information.
As a possible implementation manner, the main image may be divided into a plurality of cells according to the continuity of the framed picture; for example, referring to fig. 2, which is a schematic diagram of dividing the main image in the embodiment of the present invention, the main image may be divided into nine cells, respectively cells A, B, C, D, E, F, G, H and I.
The exposure value of a cell may then be determined from the exposure values of the light measuring points contained in that cell: for example, as the average of the exposure values of all the light measuring points of the cell, or as the sum of those exposure values, or according to any other algorithm, which is not limited here. After the exposure value of each cell is determined, the exposure adjustment area of the main image and the corresponding exposure compensation information may be determined according to the cell exposure values, where the exposure compensation information may be, for example, the exposure value that needs to be compensated.
It can be understood that, when the exposure is appropriate, the exposure value of each unit should be within a preset value range, which can be preset for a built-in program of the mobile terminal, for example, the preset value range is marked as [ a, b ]. When the exposure value of the unit is not within the preset value range, the unit can be determined as an exposure adjustment area. For example, when the exposure value of a cell is lower than the minimum value a of the preset value range, it indicates that the cell is under-exposed, and when the exposure compensation information is the exposure value that needs to be compensated, the exposure compensation information may be a positive value. And when the exposure value of the unit is higher than the maximum value b of the preset value range, the unit is indicated to be over-exposed, and at the moment, the exposure compensation information can be a negative value.
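Steps 102's cell-based metering can be sketched as follows. This is a minimal sketch under stated assumptions: the grid of per-point exposure values, the 3x3 cell division, and the preset range [a, b] follow the text above, while the concrete range (-1.0, 1.0) EV is an illustrative placeholder.

```python
def exposure_adjustment_areas(point_evs, rows=3, cols=3, ev_range=(-1.0, 1.0)):
    """Divide a 2D grid of per-point exposure values into rows*cols cells,
    average each cell, and flag cells whose exposure value falls outside the
    preset range [a, b], together with the EV that needs to be compensated
    (positive for under-exposure, negative for over-exposure)."""
    a, b = ev_range
    h, w = len(point_evs), len(point_evs[0])
    areas = []
    for r in range(rows):
        for c in range(cols):
            cell = [point_evs[y][x]
                    for y in range(r * h // rows, (r + 1) * h // rows)
                    for x in range(c * w // cols, (c + 1) * w // cols)]
            ev = sum(cell) / len(cell)           # cell EV = mean of its points
            if ev < a:
                areas.append(((r, c), a - ev))   # under-exposed cell
            elif ev > b:
                areas.append(((r, c), b - ev))   # over-exposed cell
    return areas
```

The returned list pairs each flagged cell index with its compensation, matching the sign convention described above.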
Step 103, acquiring the depth information of the main image according to the main image and the auxiliary image.
Specifically, since there is a certain distance between the main camera and the sub-camera, the two cameras have parallax, and images of the same scene taken by the two cameras differ. The main image is captured by the main camera and the sub-image by the sub-camera, so there should be some difference between the main image and the sub-image. According to the principle of triangulation, the depth information of the same object in the main image and the sub-image, namely the distance between the object and the plane where the main camera and the sub-camera are located, can be calculated.
For clarity of explanation of this process, the principle of triangulation will be briefly described below.
In actual scenes, human eyes resolve scene depth mainly by binocular vision, and the two cameras resolve depth on the same principle. In this embodiment, the depth information of the main image is calculated from the pair of captured images, mainly based on the principle of triangulation; fig. 3 is a schematic diagram of the principle of triangulation.
As shown in fig. 3, in real space the imaged object, the positions O_R and O_T of the two cameras, and the focal planes of the two cameras are drawn. The distance from the focal planes to the plane where the two cameras are located is f, and the two cameras form images at the focal planes, yielding the two captured images.
P and P' are the positions of the same object in the two captured images, where the distance from point P to the left boundary of its captured image is X_R and the distance from point P' to the left boundary of its captured image is X_T. O_R and O_T denote the two cameras, which lie on the same plane at a distance B from each other.
Based on the principle of triangulation, the distance Z between the object and the plane where the two cameras are located in fig. 3 satisfies:
(B - (X_R - X_T)) / B = (Z - f) / Z
From this it can be derived that:
Z = (B * f) / (X_R - X_T) = (B * f) / d
where d = X_R - X_T is the difference between the positions of the same object in the two captured images (the disparity). Since B and f are constants, the distance Z of the object can be determined from d.
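The triangulation relation Z = B * f / d translates directly into code. A minimal sketch (units of baseline, focal length, and image coordinates are assumed consistent, e.g. all in millimetres):

```python
def depth_from_disparity(x_r, x_t, baseline, focal_length):
    """Triangulation: Z = B * f / d, where d = x_r - x_t is the disparity of
    the same point in the two captured images. Returns the distance of the
    object from the plane of the two cameras."""
    d = x_r - x_t
    if d == 0:
        return float("inf")  # zero disparity: the point is at infinity
    return baseline * focal_length / d
```

Note that a smaller disparity d yields a larger depth Z, consistent with distant objects shifting less between the two views.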
Step 104, determining a foreground area and a background area of the main image according to the depth information of the main image.
Specifically, after the depth information of the main image is calculated, whether each object is foreground or background may be determined according to the depth information of each object in the main image. Generally, a smaller depth value indicates that the object is closer to the plane where the primary camera and the secondary camera are located, and such an object can be determined to be foreground; otherwise it is background. The foreground region and the background region of the main image can then be determined from the objects in the main image.
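The depth-based split can be sketched as a simple threshold on a per-pixel depth map (the threshold is scene-dependent and would be chosen by the implementation; here it is a parameter):

```python
def split_foreground_background(depth_map, threshold):
    """Classify each pixel as foreground (depth below threshold, i.e. close to
    the camera plane) or background. depth_map maps (y, x) -> depth."""
    fg, bg = set(), set()
    for coord, depth in depth_map.items():
        (fg if depth < threshold else bg).add(coord)
    return fg, bg
```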
Step 105, performing exposure adjustment on the main image, in the overlapping part of the foreground area and the exposure adjustment area, according to the exposure compensation information corresponding to the exposure adjustment area.
In the embodiment of the present invention, since the exposure value of an exposure adjustment area may be lower than the minimum value a of the preset value range or higher than the maximum value b, there may be multiple exposure adjustment areas. Therefore, after the exposure adjustment areas of the main image are determined, the overlapping portion of each exposure adjustment area with the foreground area may be determined, and exposure adjustment of the main image may then be performed in each overlapping portion according to the exposure compensation information corresponding to that exposure adjustment area.
As a possible implementation manner, one or more of the brightness, the contrast, and the color saturation may be adjusted according to the exposure compensation information corresponding to the exposure adjustment region.
For example, when the exposure value of the overlapping portion of the exposure adjustment region is lower than the minimum value a of the preset value range, the overlapping portion may be subjected to brightness-up processing, or brightness, contrast, and color saturation-up processing may be simultaneously performed. When the exposure value of the overlapped part of the exposure adjustment area is higher than the maximum value b of the preset value range, the overlapped part can be subjected to brightness reduction processing, or brightness, contrast and color saturation reduction processing can be simultaneously carried out.
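A minimal sketch of the brightness part of this adjustment, under the assumption that the compensation is expressed as an EV delta applied as a linear gain (+1 EV doubles brightness, -1 EV halves it); real pipelines may also adjust contrast and color saturation and typically work in a non-linear color space:

```python
def compensate_region(pixels, region, ev_delta):
    """Apply an EV compensation to the pixels inside `region` only:
    gain = 2**ev_delta, clipped to the 8-bit ceiling. `pixels` maps
    (y, x) -> linear intensity; `region` is a set of coordinates."""
    gain = 2.0 ** ev_delta
    return {coord: (min(255.0, value * gain) if coord in region else value)
            for coord, value in pixels.items()}
```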
Step 106, blurring the exposure-adjusted main image in the background area to obtain an imaging image.
In the embodiment of the invention, the main image after exposure adjustment is subjected to blurring processing in the background area to obtain an imaging image, wherein the foreground is more prominent in the imaging image, and the background is blurred to present the imaging effect of the focusing foreground. In addition, in the embodiment of the invention, only the background area is subjected to blurring processing, so that the exposure condition of the background area has little influence on the imaging effect, the background area is not required to be subjected to exposure processing, and the processing efficiency of the mobile terminal is effectively improved.
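Restricting the blur to the background mask can be sketched with a simple box blur; the patent does not specify the blur kernel, so the box filter here is an illustrative assumption (a Gaussian would be typical in practice):

```python
def blur_background(image, bg_mask, radius=1):
    """Box-blur only the pixels in bg_mask, leaving the foreground sharp.
    `image` is a 2D list of intensities; `bg_mask` is a set of (y, x)
    coordinates belonging to the background region."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for (y, x) in bg_mask:
        # average over the clipped (2*radius+1)^2 neighbourhood
        neigh = [image[j][i]
                 for j in range(max(0, y - radius), min(h, y + radius + 1))
                 for i in range(max(0, x - radius), min(w, x + radius + 1))]
        out[y][x] = sum(neigh) / len(neigh)
    return out
```

Because only masked pixels are rewritten, the foreground stays untouched, matching the focused-foreground effect described above.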
In the imaging method of this embodiment, after the main image acquired by the main camera and the sub-image acquired by the sub-camera are acquired, the exposure adjustment area of the main image and the corresponding exposure compensation information are determined according to the exposure information obtained by photometry. The depth information of the main image is then acquired according to the main image and the sub-image; finally, the foreground area and the background area of the main image are determined according to the depth information, exposure adjustment is performed on the main image in the overlapping part of the foreground area and the exposure adjustment area according to the corresponding exposure compensation information, and the exposure-adjusted main image is blurred in the background area to obtain the imaging image. In this embodiment, the exposure adjustment area and the corresponding exposure compensation information of the main image are determined by pre-photometry, so that exposure adjustment of the main image is performed according to the exposure compensation information corresponding to the exposure adjustment area; overexposed and underexposed areas are thereby effectively prevented from occurring in the imaging image, and the exposure effect of the imaging image is improved. Meanwhile, the sub-image is shot synchronously with the main image, so the exposure-adjusted main image can subsequently be blurred according to the corresponding sub-image; on the one hand this improves the imaging effect of the picture, and on the other hand it improves the accuracy of the depth information, so that the image processing effect is better.
For clarity of the above embodiment, this embodiment provides another imaging method, and fig. 4 is a schematic flow chart of the second imaging method provided in this embodiment of the present application.
As shown in fig. 4, before step 101, the imaging method may further include the steps of:
step 201, controlling the main camera and/or the auxiliary camera to measure light to obtain exposure information.
In the embodiment of the invention, the camera can be controlled in advance to perform photometry, so as to obtain the exposure information of the scene where the mobile terminal is currently located. The exposure information includes an Exposure Value (EV), which generally ranges from about -3 EV to 3 EV. If the ambient light is dark, the exposure value is large; conversely, the brighter the environment, the smaller the exposure value.
Note that EV is a quantity reflecting the amount of exposure. It was originally defined as follows: with a sensitivity of ISO 100, an aperture of F1 and an exposure time of 1 second, the exposure is defined as EV 0; each time the exposure is reduced by one step (the shutter time is halved or the aperture is reduced by one stop), the EV value increases by 1. That is, each increase of 1 in the exposure value changes the exposure by one step, i.e. the exposure is halved, for example by halving the exposure time or the aperture area. A brighter environment or a higher sensitivity should correspond to a larger exposure value.
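The definition above (f/1 at 1 second is EV 0, and each halving of the exposure raises EV by 1) is the conventional exposure-value formula; a minimal sketch of that relation, not anything specific to the patent:

```python
import math

def exposure_value(f_number: float, shutter_seconds: float) -> float:
    """EV = log2(N^2 / t): f/1 at 1 s gives EV 0 (at ISO 100);
    halving the shutter time or stopping down one stop adds 1 EV."""
    return math.log2(f_number ** 2 / shutter_seconds)

print(exposure_value(1.0, 1.0))   # the baseline in the text: EV 0
print(exposure_value(1.0, 0.5))   # half the shutter time -> EV 1
print(exposure_value(2.0, 1.0))   # f/2 admits a quarter of the light -> EV 2
```

Each +1 EV corresponds to half the exposure, which is why a brighter environment meters at a larger exposure value.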
Step 202, determining exposure parameters according to the exposure information.
In the embodiment of the present invention, the exposure parameters include shutter speed, f-number and/or sensitivity.
As a possible implementation manner, the corresponding relationship between different exposure information and exposure parameters may be pre-established, so that after the exposure information is determined, the exposure parameters corresponding to the exposure information may be obtained by querying the corresponding relationship between the exposure information and the exposure parameters.
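One way to realize the pre-established correspondence described here is a lookup table keyed by the metered exposure value. The table below is a hypothetical sketch: the EV keys and the (shutter, f-number, ISO) triples are illustrative placeholders, not values from the patent:

```python
# Hypothetical correspondence from metered EV to exposure parameters
# (shutter seconds, f-number, ISO); all values are placeholders.
EXPOSURE_TABLE = {
    -1: (1 / 30, 1.8, 800),    # dim scene: slow shutter, wide aperture
    0:  (1 / 60, 2.0, 400),
    1:  (1 / 125, 2.8, 200),   # bright scene: fast shutter, narrow aperture
}

def exposure_parameters(ev):
    """Query the pre-established EV -> parameter correspondence,
    falling back to the nearest known entry."""
    key = min(EXPOSURE_TABLE, key=lambda k: abs(k - ev))
    return EXPOSURE_TABLE[key]

shutter, f_number, iso = exposure_parameters(1)
print(shutter, f_number, iso)   # 0.008 2.8 200
```

The nearest-entry fallback keeps the query total: any metered EV maps to some parameter triple even between table entries.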
Step 203, respectively acquiring a main preview screen for previewing the target shooting object from the main camera and a sub preview screen for previewing the target shooting object from the sub camera.
In the embodiment of the invention, the imaging picture can be previewed before shooting. For example, the pictures captured by the main camera and the sub-camera may each be previewed, and then a main preview picture for previewing the target photographic subject may be acquired from the main camera and a sub-preview picture for previewing the target photographic subject may be acquired from the sub-camera, respectively.
Step 204, determining the depth information of the main preview picture according to the main preview picture and the auxiliary preview picture.
Specifically, since there is a certain distance between the main camera and the sub-camera, the two cameras have parallax, so the preview pictures of the same target photographic subject acquired from the different cameras will differ. The main preview picture is obtained from the main camera and the auxiliary preview picture from the sub-camera, so a certain difference should exist between the two. According to the principle of triangulation, the depth information of the same object in the main preview picture and the auxiliary preview picture, namely the distance between the object and the plane where the main camera and the sub-camera are located, can be calculated.
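The triangulation step reduces to the standard stereo relation: depth = focal length × baseline / disparity. A minimal sketch, with illustrative numbers (the patent does not specify focal length or camera spacing):

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Stereo triangulation: an object's distance from the camera plane
    is inversely proportional to its disparity between the two views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. 1000 px focal length, 12 mm between main and secondary camera,
# 8 px of disparity for some matched object:
print(depth_from_disparity(1000.0, 0.012, 8.0))  # about 1.5 m
```

Objects closer to the cameras shift more between the two views (larger disparity), which is why nearby pixels get small depth values and can later be classified as foreground.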
Step 205, determining, according to the exposure information, whether the target photographic object is in an overexposure state; if so, executing step 206; otherwise, executing step 208.
It is to be understood that when the target photographic object is dark, the exposure value of the corresponding target photographic object in the main image is lower than a preset threshold, and it may then be determined that the target photographic object is in an underexposure state; when the target photographic object is bright, the exposure value of the corresponding target photographic object in the main image is higher than the preset threshold, and it may then be determined that the target photographic object is in an overexposure state. The preset threshold may be preset by a built-in program of the mobile terminal; for example, the preset threshold is denoted c.
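The classification described here is a simple threshold comparison; a sketch in which the threshold c and the exposure values are placeholders:

```python
def exposure_state(subject_ev: float, c: float) -> str:
    """Classify the target photographic object against preset threshold c."""
    if subject_ev > c:
        return "overexposed"   # branch to negative compensation (step 206)
    if subject_ev < c:
        return "underexposed"  # branch to fill light (step 208)
    return "normal"

print(exposure_state(2.5, 1.0))   # overexposed
print(exposure_state(-1.0, 1.0))  # underexposed
```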
Step 206, determining a negative exposure compensation value required by the target photographic object according to the exposure information.
In the embodiment of the invention, when the target photographic object is in an overexposure state, the photographic object can be shielded from light in order to avoid an overexposed area in the image captured by the mobile terminal. Specifically, a negative exposure compensation value required by the target photographic object may first be determined.
As one possible implementation, the distance of the target photographic subject from the main camera may be determined according to the depth information of the target photographic subject in the main preview screen. Alternatively, the correspondence between different distances and exposure compensation values may be established in advance, so that after the distances are determined, the correspondence may be queried to obtain the exposure compensation value corresponding thereto, and then the obtained exposure compensation value is used as the negative exposure compensation value required by the target photographic object.
Step 207, shading the target photographic object according to the negative exposure compensation value.
Optionally, after the negative exposure compensation value is determined, the target photographic object may be shielded from light according to the negative exposure compensation value, so as to avoid an overexposure area in the main image when the target photographic object is subsequently photographed. Alternatively, when the target photographic object is in an overexposed state, the target photographic object may be shielded by a light shielding plate, which is not limited in this embodiment of the present invention.
Step 208, supplementing light to the target photographic object according to the depth information of the main preview picture and the exposure information.
In the embodiment of the invention, when the target photographic object is in an underexposure state, light supplement processing can be performed on the photographic object in order to avoid an underexposed area in the image captured by the mobile terminal. Specifically, a forward exposure compensation value required by the target photographic object may first be determined.
As one possible implementation, the distance of the target photographic subject from the main camera may be determined according to the depth information of the target photographic subject in the main preview screen. Alternatively, the correspondence between different distances and exposure compensation values may be established in advance, so that after the distances are determined, the correspondence may be queried to obtain the exposure compensation value corresponding thereto, and then the obtained exposure compensation value is used as the forward exposure compensation value required by the target photographic subject.
After the forward exposure compensation value required by the target shooting object is determined, the target shooting object can be supplemented with light according to the forward exposure compensation value, so that the situation that an underexposure area appears in a main image when the target shooting object is shot subsequently is avoided.
And step 209, controlling the main camera and the auxiliary camera to acquire images according to the exposure parameters.
Optionally, after the exposure parameters are determined, the main camera and the auxiliary camera may be controlled to perform image acquisition according to the exposure parameters.
In the imaging method of the embodiment, when the target photographic object is in an overexposure state, the negative exposure compensation value required by the target photographic object is determined according to the exposure information, and the target photographic object is shielded from light according to the negative exposure compensation value, so that the situation that an overexposure area appears in a main image when the target photographic object is subsequently photographed can be avoided. When the target shooting object is in an underexposure state, the target shooting object is subjected to light supplement according to the depth information and the exposure information of the main preview picture, so that the situation that an underexposure area appears in a main image when the target shooting object is shot subsequently can be avoided, and the exposure effect of the main image can be improved during subsequent image acquisition.
As a possible implementation manner, referring to fig. 5, on the basis of the embodiment shown in fig. 4, step 208 specifically includes the following sub-steps:
step 301, determining a forward exposure compensation value required by the target shooting object according to the exposure information.
As a possible implementation manner, exposure values of a plurality of light measurement points arranged in a matrix in the main image may be determined according to the exposure information, then the exposure values of the plurality of light measurement points are averaged, a difference value is obtained by subtracting the average value from a preset threshold value c, and then the difference value is used as a forward exposure compensation value required by the target photographic object.
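The averaging procedure in this implementation can be sketched directly. The metering values and the threshold c below are illustrative:

```python
def forward_compensation(metering_evs, c):
    """Average the matrix-arranged metering points and return the
    shortfall relative to preset threshold c as the forward (positive)
    exposure compensation value required by the target object."""
    avg = sum(metering_evs) / len(metering_evs)
    return c - avg

points = [0.5, 1.5, 0.0, 1.0]   # hypothetical metering-point EVs
print(forward_compensation(points, c=1.0))   # 0.25
```

A positive result means the subject meters darker than the threshold and needs that much fill-light compensation; a negative result would indicate no fill light is required.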
As another possible implementation, the distance of the target photographic subject from the main camera may be determined from the depth information of the target photographic subject in the main preview screen. Optionally, the corresponding relationship between different distances and the forward exposure compensation value may be pre-established, so that after the distance is determined, the corresponding relationship may be queried to obtain the forward exposure compensation value corresponding thereto.
And step 302, generating light distribution control parameters of the flash lamp according to the depth information and the forward exposure compensation value of the target shooting object in the main preview picture.
In the embodiment of the invention, the corresponding relation between different distances and the forward exposure compensation value can be established in advance, then the light distribution curve can be determined according to the corresponding relation between the distances and the forward exposure compensation value, and further the light distribution control parameters required by the flash lamp to generate the light distribution curve can be determined according to the light distribution curve.
As a possible implementation manner, the flash lamp includes a plurality of lamp beads, and the light distribution control parameter may be the current value applied to different lamp beads, or the light distribution control parameter may be the states of different lamp beads, where the state of a lamp bead includes an on state and an off state; this is not limited in the embodiment of the present invention.
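As one hypothetical realization of these light-distribution parameters, each bead could be driven with a current scaled by the required forward compensation and capped at a hardware maximum. All constants here (current per EV, maximum current) are assumptions, not values from the patent:

```python
def bead_currents(num_beads, forward_ev, max_ma=100.0, ma_per_ev=30.0):
    """Illustrative light-distribution control parameters: drive every
    bead with a current proportional to the forward compensation value;
    beads are switched off when no compensation is needed."""
    if forward_ev <= 0:
        return [0.0] * num_beads              # off state
    current = min(max_ma, forward_ev * ma_per_ev)
    return [current] * num_beads              # on state, uniform drive

print(bead_currents(4, 1.5))   # [45.0, 45.0, 45.0, 45.0]
```

A real driver would likely shape the per-bead currents according to the light distribution curve mentioned above rather than driving all beads uniformly.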
Step 303, controlling the flash lamp to supplement light to the target photographic object according to the light distribution control parameters.
In the embodiment of the invention, after the light distribution control parameter of the flash lamp is determined, the flash lamp can be controlled to supplement light to the target shooting object, so that the condition that an under-exposure area appears in a main image when the target shooting object is shot subsequently is avoided, and the exposure effect of the main image is improved.
In the imaging method of this embodiment, the forward exposure compensation value required by the target photographic object is determined according to the exposure information, the light distribution control parameters of the flash lamp are generated according to the depth information of the target photographic object in the main preview picture and the forward exposure compensation value, and the flash lamp is controlled to supplement light to the target photographic object according to the light distribution control parameters. In this way, an underexposed area can be avoided in the main image when the target photographic object is subsequently photographed, and the exposure effect of the main image is improved.
In order to implement the above embodiments, the present application also proposes an imaging apparatus.
Fig. 6 is a schematic structural diagram of an imaging device according to an embodiment of the present application.
As shown in fig. 6, the imaging apparatus includes: an acquisition module 601, a first determination module 602, a depth of field module 603, a first processing module 604, an exposure module 605, and a blurring module 606. Wherein,
the acquiring module 601 is configured to acquire a main image acquired by the main camera and acquire an auxiliary image acquired by the auxiliary camera.
The first determining module 602 is configured to determine an exposure adjustment area of the main image and corresponding exposure compensation information according to exposure information obtained by photometry.
As a possible implementation manner, the first determining module 602 is specifically configured to determine, according to the exposure information, exposure values of a plurality of light measurement points arranged in a matrix in the main image; dividing the main image into a plurality of units according to the framing picture continuity of the main image; determining the exposure value of each unit according to the exposure value of the light measuring point contained in each unit; and if the exposure value of the unit is not in the preset value range, determining the unit as an exposure adjustment area.
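The unit logic of the first determining module can be sketched with NumPy under a simplifying assumption: the units are taken as rectangular blocks of the metering grid (the patent's division by "framing picture continuity" is more involved), and a unit whose mean exposure value falls outside the preset range is flagged for adjustment. Grid values and the range are placeholders:

```python
import numpy as np

def exposure_adjustment_units(metering, unit_h, unit_w, lo, hi):
    """metering: 2-D array of per-point exposure values. Returns a
    boolean grid that is True where a unit's mean exposure value falls
    outside the preset range [lo, hi], i.e. the unit needs adjustment."""
    h, w = metering.shape
    units = metering.reshape(h // unit_h, unit_h, w // unit_w, unit_w)
    unit_ev = units.mean(axis=(1, 3))
    return (unit_ev < lo) | (unit_ev > hi)

grid = np.array([[0.0, 0.0, 3.0, 3.0],
                 [0.0, 0.0, 3.0, 3.0]])   # 2x4 matrix of metering points
print(exposure_adjustment_units(grid, 2, 2, lo=0.5, hi=2.0))
# left 2x2 unit underexposed, right unit overexposed -> both flagged
```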
The depth of field module 603 is configured to obtain depth information of the main image according to the main image and the secondary image.
The first processing module 604 is configured to determine a foreground region and a background region of the main image according to the depth information of the main image.
The exposure module 605 is configured to perform exposure adjustment on the main image according to the exposure compensation information corresponding to the exposure adjustment area at the overlapping portion of the foreground area and the exposure adjustment area.
As a possible implementation manner, the exposure module 605 is specifically configured to determine, in the foreground region, a portion overlapping each exposure adjustment region respectively; and in the overlapping part of each exposure adjustment area, adjusting one or more of brightness, contrast and color saturation according to the exposure compensation information corresponding to the exposure adjustment area.
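The exposure module's overlap adjustment can be sketched for the brightness case alone (the patent also mentions contrast and color saturation). The masks and gain below are illustrative; one EV of gain is taken as a doubling of brightness:

```python
import numpy as np

def adjust_overlap(image, foreground_mask, adjust_mask, ev_gain):
    """Apply exposure compensation only where the foreground region
    overlaps an exposure-adjustment region: scale brightness by
    2**ev_gain (one EV doubles the brightness), clipped to 8 bits."""
    out = image.astype(np.float32)
    overlap = foreground_mask & adjust_mask
    out[overlap] *= 2.0 ** ev_gain
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.full((2, 2), 60, dtype=np.uint8)
fg = np.array([[True, True], [False, False]])    # foreground region
adj = np.array([[True, False], [True, False]])   # adjustment region
print(adjust_overlap(img, fg, adj, ev_gain=1.0))
# only the top-left pixel lies in both masks: 60 -> 120
```

Pixels in the adjustment region but outside the foreground are deliberately left alone, matching the text: background exposure is irrelevant once the background is blurred.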
The blurring module 606 is configured to perform blurring processing on the main image after exposure adjustment in the background area to obtain an imaged image.
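The blurring module's composite step can be sketched without an imaging library: blur the whole frame, then keep the sharp pixels wherever the depth-derived foreground mask is set. The box blur here is a simple stand-in; the actual kernel the device uses is not specified in this passage:

```python
import numpy as np

def box_blur(image, radius=1):
    """Naive box blur, a stand-in for the device's real blur kernel."""
    h, w = image.shape
    padded = np.pad(image.astype(np.float32), radius, mode="edge")
    out = np.zeros((h, w), dtype=np.float32)
    for dy in range(2 * radius + 1):
        for dx in range(2 * radius + 1):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (2 * radius + 1) ** 2

def blur_background(image, foreground_mask):
    """Composite sharp foreground pixels over a blurred background."""
    blurred = box_blur(image)
    return np.where(foreground_mask, image.astype(np.float32), blurred)

img = np.zeros((4, 4), dtype=np.float32)
img[:, 2] = 90.0                 # a bright stripe: the subject
fg = np.zeros((4, 4), dtype=bool)
fg[:, 2] = True                  # depth marks the stripe as foreground
result = blur_background(img, fg)
print(result[0])  # stripe stays sharp at 90; it bleeds to 30 in the blur
```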
Further, in a possible implementation manner of the embodiment of the present application, referring to fig. 7, on the basis of the embodiment shown in fig. 6, the imaging apparatus may further include: a control determination module 607, a second determination module 608, an acquisition determination module 609, and a second processing module 610. Wherein,
the control determining module 607 is configured to control the main camera and/or the sub camera to perform photometry to obtain exposure information before acquiring the main image acquired by the main camera and acquiring the sub image acquired by the sub camera; and determining exposure parameters according to the exposure information.
And a second determining module 608, configured to control the main camera and the sub-camera to perform image acquisition according to the exposure parameter.
An obtaining and determining module 609, configured to obtain, before controlling the main camera and the sub-camera to perform image acquisition according to the exposure parameter, a main preview picture for previewing the target photographic object from the main camera, and a sub-preview picture for previewing the target photographic object from the sub-camera, respectively; and determining the depth information of the main preview picture according to the main preview picture and the auxiliary preview picture.
The second processing module 610 is configured to, when it is determined according to the exposure information that the target photographic object is in an underexposure state, supplement light to the target photographic object according to the depth information of the main preview picture and the exposure information.
As a possible implementation manner, the second processing module 610 is specifically configured to determine a forward exposure compensation value required by the target photographic object according to the exposure information; generate light distribution control parameters of the flash lamp according to the depth information of the target photographic object in the main preview picture and the forward exposure compensation value; and control the flash lamp to supplement light to the target photographic object according to the light distribution control parameters.
Optionally, the second processing module 610 is further configured to, when it is determined according to the exposure information that the target photographic object is in an overexposure state, determine a negative exposure compensation value required by the target photographic object according to the exposure information, and shade the target photographic object from light according to the negative exposure compensation value.
It should be noted that the foregoing explanation of the embodiment of the imaging method is also applicable to the imaging device of this embodiment, and is not repeated here.
In the imaging apparatus of this embodiment, after the main image acquired by the main camera and the sub-image acquired by the sub-camera are acquired, the exposure adjustment area of the main image and the corresponding exposure compensation information are determined according to the exposure information obtained by photometry. And then according to the main image and the auxiliary image, acquiring the depth information of the main image, finally determining a foreground area and a background area of the main image according to the depth information of the main image, carrying out exposure adjustment on the main image according to exposure compensation information corresponding to the exposure adjustment area at the overlapping part of the foreground area and the exposure adjustment area, and carrying out blurring processing on the main image after the exposure adjustment in the background area to obtain an imaging image. In this embodiment, the exposure adjustment region and the corresponding exposure compensation information of the main image are determined by pre-measuring light, so that the exposure adjustment of the main image is performed according to the exposure compensation information corresponding to the exposure adjustment region, thereby effectively avoiding an overexposure region and an underexposure region from occurring in the imaging image and improving the exposure effect of the imaging image. Meanwhile, the auxiliary image shot by the auxiliary camera and the main image shot by the main camera are shot synchronously, so that the main image after exposure adjustment is subjected to subsequent blurring processing according to the corresponding auxiliary image, on one hand, the imaging effect of the imaging picture is improved, on the other hand, the accuracy of the depth information is improved, and the image processing effect is better.
In order to implement the foregoing embodiments, the present application further proposes a mobile terminal, and fig. 8 is a schematic structural diagram of a terminal device according to another embodiment of the present application, and as shown in fig. 8, the terminal device 1000 includes: a housing 1100 and a primary camera 1112, a secondary camera 1113, a memory 1114, and a processor 1115 located within the housing 1100.
Wherein the memory 1114 stores executable program code; the processor 1115 executes a program corresponding to the executable program code by reading the executable program code stored in the memory 1114 for performing the imaging method as described in the aforementioned method embodiments.
In order to implement the above embodiments, the present application also proposes a computer-readable storage medium having stored thereon a computer program which, when executed by a processor of a mobile terminal, implements the imaging method as proposed in the foregoing embodiments.
The mobile terminal may further include an Image Processing circuit, which may be implemented by hardware and/or software components, and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 9 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 9, for convenience of explanation, only aspects of the image processing technique related to the embodiments of the present application are shown.
As shown in fig. 9, the image processing circuit includes an ISP processor 940 and a control logic 950. The image data captured by the imaging device 910 is first processed by the ISP processor 940, and the ISP processor 940 analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of the imaging device 910. Imaging device 910 may specifically include two cameras, each of which may include one or more lenses 912 and an image sensor 914. Image sensor 914 may include an array of color filters (e.g., Bayer filters), and image sensor 914 may acquire light intensity and wavelength information captured with each imaging pixel of image sensor 914 and provide a set of raw image data that may be processed by ISP processor 940. The sensor 920 may provide the raw image data to the ISP processor 940 based on the type of interface of the sensor 920. The sensor 920 interface may utilize an SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
The ISP processor 940 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 940 may perform one or more image processing operations on the raw image data, collecting statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 940 may also receive pixel data from image memory 930. For example, raw pixel data is sent from the sensor 920 interface to the image memory 930, and the raw pixel data in the image memory 930 is then provided to the ISP processor 940 for processing. The image Memory 930 may be a part of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from the sensor 920 interface or from the image memory 930, the ISP processor 940 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to image memory 930 for additional processing before being displayed. ISP processor 940 receives processed data from image memory 930 and performs image data processing on the processed data in the raw domain and in the RGB and YCbCr color spaces. The processed image data may be output to a display 970 for viewing by a user and/or further processing by a Graphics Processing Unit (GPU). Further, the output of ISP processor 940 may also be sent to image memory 930 and display 970 may read image data from image memory 930. In one embodiment, image memory 930 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 940 may be transmitted to an encoder/decoder 960 for encoding/decoding the image data. The encoded image data may be saved and decompressed before being displayed on a display 970 device. The encoder/decoder 960 may be implemented by a CPU or GPU or coprocessor.
The statistical data determined by the ISP processor 940 may be transmitted to the control logic 950 unit. For example, the statistical data may include image sensor 914 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 912 shading correction, and the like. The control logic 950 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of the imaging device 910 and, in turn, control parameters based on the received statistical data. For example, the control parameters may include sensor 920 control parameters (e.g., gain, integration time for exposure control), camera flash control parameters, lens 912 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 912 shading correction parameters.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (10)

1. An imaging method, characterized in that the method comprises the steps of:
acquiring a main image acquired by a main camera and an auxiliary image acquired by an auxiliary camera, the main camera and the auxiliary camera being determined according to ambient brightness: when the ambient brightness is not higher than a threshold brightness, a high-sensitivity camera serves as the main camera and a high-resolution camera serves as the auxiliary camera; when the ambient brightness is higher than the threshold brightness, the high-resolution camera serves as the main camera and the high-sensitivity camera serves as the auxiliary camera;
determining an exposure adjustment area of the main image and corresponding exposure compensation information according to exposure information obtained by photometry: dividing the main image into a plurality of units according to the continuity of the viewfinder image of the main image, determining an exposure value of each unit according to the exposure information, and determining the exposure adjustment area of the main image and the corresponding exposure compensation information according to the exposure value of each unit;
acquiring depth information of the main image according to the main image and the auxiliary image;
determining a foreground area and a background area of the main image according to the depth information of the main image;
performing exposure adjustment on the main image in the portion where the foreground area overlaps each exposure adjustment area, according to the exposure compensation information corresponding to that exposure adjustment area;
and blurring the background area of the exposure-adjusted main image to obtain an imaging image.
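For illustration only (not part of the claimed disclosure), the camera-role selection recited in claim 1 can be sketched as follows; the camera objects and the numeric brightness threshold are assumptions:

```python
# Hypothetical sketch of the camera-role selection in claim 1.
# The camera handles and the threshold value are illustrative assumptions.

def assign_cameras(ambient_brightness, high_iso_cam, high_res_cam, threshold=50.0):
    """Return (main_camera, auxiliary_camera) based on ambient brightness."""
    if ambient_brightness <= threshold:
        # "Not higher than" the threshold: low light, so the
        # high-sensitivity camera captures the main image.
        return high_iso_cam, high_res_cam
    # Bright scene: the high-resolution camera captures the main image.
    return high_res_cam, high_iso_cam
```

In a bright scene the roles swap, so the same pair of physical cameras serves both conditions.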
2. The imaging method according to claim 1, wherein, before acquiring the main image acquired by the main camera and the auxiliary image acquired by the auxiliary camera, the method further comprises:
controlling the main camera and/or the auxiliary camera to perform photometry to obtain the exposure information;
determining exposure parameters according to the exposure information;
and controlling the main camera and the auxiliary camera to acquire images according to the exposure parameters.
3. The imaging method according to claim 2, wherein, before controlling the main camera and the auxiliary camera to perform image acquisition according to the exposure parameters, the method further comprises:
respectively acquiring, from the main camera, a main preview picture for previewing a target shooting object, and acquiring, from the auxiliary camera, an auxiliary preview picture for previewing the target shooting object;
determining the depth information of the main preview picture according to the main preview picture and the auxiliary preview picture;
and if the target shooting object is determined to be in an underexposed state according to the exposure information, supplementing light to the target shooting object according to the depth information of the main preview picture and the exposure information.
4. The imaging method according to claim 3, wherein supplementing light to the target shooting object according to the depth information of the main preview picture and the exposure information comprises:
determining a forward exposure compensation value required by the target shooting object according to the exposure information;
generating a light distribution control parameter of a flash lamp according to the depth information of the target shooting object in the main preview picture and the forward exposure compensation value;
and controlling the flash lamp to supplement light to the target shooting object according to the light distribution control parameters.
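For illustration only, one way to derive a flash light-distribution parameter of the kind recited in claim 4 is an inverse-square model; the falloff model, the EV scaling, and the base constant are assumptions, not the patent's disclosed method:

```python
# Hypothetical sketch of claim 4: computing a flash output level from the
# subject's depth and the required forward exposure compensation value.
# The inverse-square falloff and the EV-doubling rule are assumptions.

def flash_intensity(depth_m, forward_ev, base_intensity=1.0):
    """Scale flash output by subject distance and forward EV compensation.

    Light falls off with distance squared, and each positive EV stop
    requires roughly twice as much light on the subject.
    """
    return base_intensity * (depth_m ** 2) * (2.0 ** forward_ev)
```

A subject twice as far away thus needs four times the output at the same compensation value, which is the kind of depth-dependent light distribution the claim describes.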
5. The imaging method according to claim 3, wherein, before controlling the main camera and the auxiliary camera to perform image acquisition according to the exposure parameters, the method further comprises:
if the target shooting object is determined to be in an overexposed state according to the exposure information, determining a negative exposure compensation value required by the target shooting object according to the exposure information;
and shading the target shooting object according to the negative exposure compensation value.
6. The imaging method according to any one of claims 1 to 5, wherein there are a plurality of exposure adjustment areas, and performing exposure adjustment on the main image in the portion where the foreground area overlaps each exposure adjustment area, according to the exposure compensation information corresponding to each exposure adjustment area, comprises:
respectively determining the portion of the foreground area that overlaps each exposure adjustment area;
and, in the overlapping portion of each exposure adjustment area, adjusting one or more of brightness, contrast and color saturation according to the exposure compensation information corresponding to that exposure adjustment area.
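For illustration only, the per-region brightness adjustment recited in claim 6 might look like the following; the nested-list image representation, the boolean overlap mask, and the 2^EV gain model are assumptions:

```python
# Hypothetical sketch of claim 6: applying exposure compensation only
# where the foreground overlaps an exposure adjustment area.
# Image and mask are nested lists; the gain model is an assumption.

def adjust_brightness(image, overlap_mask, compensation_ev):
    """Scale 8-bit pixel brightness by 2**EV where the mask is True."""
    gain = 2.0 ** compensation_ev
    return [
        [min(255, round(px * gain)) if masked else px
         for px, masked in zip(row, mask_row)]
        for row, mask_row in zip(image, overlap_mask)
    ]
```

Pixels outside the overlap are left untouched, so the background (later blurred) keeps its metered exposure.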
7. The imaging method according to any one of claims 1 to 5, wherein determining the exposure adjustment area of the main image and the corresponding exposure compensation information according to the exposure information obtained by photometry comprises:
determining, according to the exposure information, exposure values of a plurality of metering points arranged in a matrix in the main image;
dividing the main image into a plurality of units according to the continuity of the viewfinder image of the main image;
determining the exposure value of each unit according to the exposure values of the metering points contained in that unit;
and if the exposure value of a unit is not within a preset value range, determining that unit to be an exposure adjustment area.
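For illustration only, the per-unit metering logic of claim 7 can be sketched as follows; the unit representation (unit id mapped to its metering-point EVs), the mean as the unit's exposure value, and the preset range are all assumptions:

```python
# Hypothetical sketch of claim 7: each unit's exposure value is taken
# as the mean of its matrix metering points; units whose value falls
# outside a preset range are flagged as exposure adjustment areas.

def find_adjustment_units(units, ev_range=(-1.0, 1.0)):
    """units: {unit_id: [metering-point EVs]}.

    Returns {unit_id: mean EV} for every unit whose mean EV lies
    outside ev_range, i.e. the exposure adjustment areas.
    """
    lo, hi = ev_range
    flagged = {}
    for unit_id, evs in units.items():
        mean_ev = sum(evs) / len(evs)
        if not (lo <= mean_ev <= hi):
            flagged[unit_id] = mean_ev
    return flagged
```

The signed mean EV of a flagged unit then doubles as its exposure compensation information: a positive value marks an overexposed unit, a negative value an underexposed one.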
8. An imaging apparatus, comprising:
the acquisition module is used for acquiring a main image acquired by the main camera and an auxiliary image acquired by the auxiliary camera, the main camera and the auxiliary camera being determined according to ambient brightness: when the ambient brightness is not higher than a threshold brightness, a high-sensitivity camera serves as the main camera and a high-resolution camera serves as the auxiliary camera; when the ambient brightness is higher than the threshold brightness, the high-resolution camera serves as the main camera and the high-sensitivity camera serves as the auxiliary camera;
the first determining module is used for determining an exposure adjustment area of the main image and corresponding exposure compensation information according to exposure information obtained by photometry: dividing the main image into a plurality of units according to the continuity of the viewfinder image of the main image, determining an exposure value of each unit according to the exposure information, and determining the exposure adjustment area of the main image and the corresponding exposure compensation information according to the exposure value of each unit;
the depth-of-field module is used for acquiring the depth information of the main image according to the main image and the auxiliary image;
the first processing module is used for determining a foreground area and a background area of the main image according to the depth information of the main image;
the exposure module is used for performing exposure adjustment on the main image in the portion where the foreground area overlaps each exposure adjustment area, according to the exposure compensation information corresponding to each exposure adjustment area;
and the blurring module is used for blurring the background area of the exposure-adjusted main image to obtain an imaging image.
9. A mobile terminal, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the imaging method according to any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the imaging method according to any one of claims 1 to 7.
CN201711243972.3A 2017-11-30 2017-11-30 Imaging method, imaging device, mobile terminal and storage medium Active CN107846556B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711243972.3A CN107846556B (en) 2017-11-30 2017-11-30 Imaging method, imaging device, mobile terminal and storage medium

Publications (2)

Publication Number Publication Date
CN107846556A CN107846556A (en) 2018-03-27
CN107846556B true CN107846556B (en) 2020-01-10

Family

ID=61663378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711243972.3A Active CN107846556B (en) 2017-11-30 2017-11-30 Imaging method, imaging device, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN107846556B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108629745B (en) * 2018-04-12 2021-01-19 Oppo广东移动通信有限公司 Image processing method and device based on structured light and mobile terminal
CN110830728A (en) * 2018-08-13 2020-02-21 浙江宇视科技有限公司 Exposure adjusting method and device
CN110853127A (en) * 2018-08-20 2020-02-28 浙江宇视科技有限公司 Image processing method, device and equipment
CN109862262A (en) * 2019-01-02 2019-06-07 上海闻泰电子科技有限公司 Image weakening method, device, terminal and storage medium
US11503216B2 (en) 2019-04-22 2022-11-15 Canon Kabushiki Kaisha Image capturing apparatus, method of controlling the same, and storage medium for controlling exposure
CN111294505B (en) * 2019-07-19 2021-05-04 展讯通信(上海)有限公司 Image processing method and device
CN113805824B (en) 2020-06-16 2024-02-09 京东方科技集团股份有限公司 Electronic device and method for displaying image on display apparatus
CN111768434A (en) * 2020-06-29 2020-10-13 Oppo广东移动通信有限公司 Disparity map acquisition method and device, electronic equipment and storage medium
CN114257712A (en) * 2020-09-25 2022-03-29 华为技术有限公司 Method and device for controlling light supplementing time of camera module

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
CN101621629B (en) * 2008-06-30 2011-09-14 睿致科技股份有限公司 Method of automatic exposure
CN103945210B (en) * 2014-05-09 2015-08-05 长江水利委员会长江科学院 A kind of multi-cam image pickup method realizing shallow Deep Canvas
CN106603933A (en) * 2016-12-16 2017-04-26 中新智擎有限公司 Exposure method and apparatus
CN106851123B (en) * 2017-03-09 2020-12-22 Oppo广东移动通信有限公司 Exposure control method, exposure control device and electronic device
CN106993112B (en) * 2017-03-09 2020-01-10 Oppo广东移动通信有限公司 Background blurring method and device based on depth of field and electronic device

Similar Documents

Publication Publication Date Title
CN107846556B (en) Imaging method, imaging device, mobile terminal and storage medium
CN107948519B (en) Image processing method, device and equipment
CN108989700B (en) Imaging control method, imaging control device, electronic device, and computer-readable storage medium
KR102279436B1 (en) Image processing methods, devices and devices
KR102376901B1 (en) Imaging control method and imaging device
CN108683862B (en) Imaging control method, imaging control device, electronic equipment and computer-readable storage medium
JP7145208B2 (en) Method and Apparatus and Storage Medium for Dual Camera Based Imaging
CN109788207B (en) Image synthesis method and device, electronic equipment and readable storage medium
CN108712608B (en) Terminal equipment shooting method and device
CN108024054B (en) Image processing method, device, equipment and storage medium
CN107948538B (en) Imaging method, imaging device, mobile terminal and storage medium
CN108337446B (en) High dynamic range image acquisition method, device and equipment based on double cameras
CN110213494B (en) Photographing method and device, electronic equipment and computer readable storage medium
CN108156369B (en) Image processing method and device
CN108616689B (en) Portrait-based high dynamic range image acquisition method, device and equipment
JP6999802B2 (en) Methods and equipment for double camera-based imaging
CN108683863B (en) Imaging control method, imaging control device, electronic equipment and readable storage medium
CN109040607B (en) Imaging control method, imaging control device, electronic device and computer-readable storage medium
CN110166705B (en) High dynamic range HDR image generation method and device, electronic equipment and computer readable storage medium
CN108024057B (en) Background blurring processing method, device and equipment
CN110213498B (en) Image generation method and device, electronic equipment and computer readable storage medium
CN107872631B (en) Image shooting method and device based on double cameras and mobile terminal
US11601600B2 (en) Control method and electronic device
CN110290325B (en) Image processing method, image processing device, storage medium and electronic equipment
CN109756680B (en) Image synthesis method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Beach Road, Changan Town, Dongguan, Guangdong Province, 523860

Applicant after: OPPO Guangdong Mobile Communications Co., Ltd.

Address before: No. 18 Wusha Beach Road, Changan Town, Dongguan, Guangdong Province, 523860

Applicant before: Guangdong Opel Mobile Communications Co., Ltd.

GR01 Patent grant