CN107071291B - Image processing method and device and electronic equipment

Image processing method and device and electronic equipment

Info

Publication number
CN107071291B
CN107071291B CN201611232344.0A CN201611232344A
Authority
CN
China
Prior art keywords
sensor
exposure parameter
exposure
photosensitive area
image
Prior art date
Legal status
Active
Application number
CN201611232344.0A
Other languages
Chinese (zh)
Other versions
CN107071291A (en)
Inventor
秦刚
Current Assignee
Nanchang Black Shark Technology Co Ltd
Original Assignee
Nanchang Black Shark Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Nanchang Black Shark Technology Co Ltd filed Critical Nanchang Black Shark Technology Co Ltd
Priority to CN201611232344.0A priority Critical patent/CN107071291B/en
Publication of CN107071291A publication Critical patent/CN107071291A/en
Application granted granted Critical
Publication of CN107071291B publication Critical patent/CN107071291B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides an image processing method, an image processing device and an electronic device, wherein the image processing method comprises the following steps: acquiring a preview image generated by a first sensor; determining a first exposure parameter of the first sensor and a second exposure parameter of a second sensor according to the brightness value of the preview image, wherein a field angle of the first sensor is greater than a field angle of the second sensor; and transmitting the first exposure parameter to the first sensor and the second exposure parameter to the second sensor, so that the first sensor performs exposure according to the first exposure parameter and the second sensor performs exposure according to the second exposure parameter. The invention can improve the quality of the captured image.

Description

Image processing method and device and electronic equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, and an electronic device.
Background
With the continuous development of electronic technology, more and more electronic devices are equipped with cameras to capture images.
In order to improve the quality of captured images, a plurality of cameras can be configured in the electronic device so as to obtain images simultaneously. For an electronic device configured with two cameras, two cameras with different Fields of View (FOV) are usually selected. For cameras with different fields of view, the light emitted by a target object reaches each sensor through its lens differently; if the sensors of these cameras with different fields of view are exposed with the same preset exposure parameter, the brightness of the images respectively obtained by the two cameras is likely to differ considerably.
When two images with a large brightness difference are synthesized, the image synthesis is likely to fail, and the quality of the resulting target image is poor.
Disclosure of Invention
The invention provides an image processing method, an image processing device and electronic equipment, and aims to improve image quality.
The invention provides an image processing method, which comprises the following steps:
acquiring a preview image generated by a first sensor;
determining a first exposure parameter of the first sensor and a second exposure parameter of the second sensor according to the brightness value of the preview image; wherein a field angle of the first sensor is greater than a field angle of the second sensor;
and transmitting the first exposure parameter to the first sensor, and transmitting the second exposure parameter to the second sensor, so that the first sensor performs exposure according to the first exposure parameter, and the second sensor performs exposure according to the second exposure parameter.
The present invention also provides an image processing apparatus comprising:
the acquisition module is used for acquiring a preview image generated by the first sensor;
the determining module is used for determining a first exposure parameter of the first sensor and a second exposure parameter of the second sensor according to the brightness value of the preview image;
the transmission module is used for transmitting the first exposure parameter to the first sensor and transmitting the second exposure parameter to the second sensor so that the first sensor performs exposure according to the first exposure parameter and the second sensor performs exposure according to the second exposure parameter; wherein a field angle of the first sensor is greater than a field angle of the second sensor.
The present invention also provides an electronic device, comprising: an application processor AP, a digital signal processor DSP, a first sensor and a second sensor; the AP is respectively connected with the first sensor and the second sensor; the DSP is respectively connected with the first sensor and the second sensor, and the DSP is also connected with the AP; wherein a field angle of the first sensor is greater than a field angle of the second sensor;
the AP is used for acquiring a preview image generated by the first sensor; determining a first exposure parameter of the first sensor and a second exposure parameter of the second sensor according to the brightness value of the preview image; transmitting the first exposure parameter to the first sensor and the second exposure parameter to the second sensor;
the first sensor is used for generating a first image according to the first exposure parameter and transmitting the first image to the DSP;
the second sensor is used for generating a second image according to the second exposure parameter and transmitting the second image to the DSP;
the DSP is used for carrying out image synthesis on the first image and the second image to obtain a target image.
According to the image processing method, the image processing device and the electronic device, the preview image generated by the first sensor is acquired, the first exposure parameter of the first sensor and the second exposure parameter of the second sensor are determined according to the brightness value of the preview image, the first exposure parameter is transmitted to the first sensor, and the second exposure parameter is transmitted to the second sensor, so that the first sensor performs exposure according to the first exposure parameter and the second sensor performs exposure according to the second exposure parameter. Because both exposure parameters are determined according to the brightness value of the same preview image, the brightness difference between the image obtained by the first sensor exposing according to the first exposure parameter and the image obtained by the second sensor exposing according to the second exposure parameter can be kept within a preset range; that is, the brightness of the two images can be kept consistent. Therefore, when the images obtained by the two sensors are synthesized, the success rate of the image synthesis is high, and the quality of the target image obtained by the synthesis is effectively improved.
Drawings
Fig. 1 is a flowchart of an image processing method of an electronic device according to the present invention;
FIG. 2 is a flowchart of a method for determining exposure parameters in an image processing method according to the present invention;
FIG. 3A is a flowchart illustrating a method for determining overlapping photosensitive regions in an image processing method according to the present invention;
FIG. 3B is a schematic diagram illustrating the positions of the photosensitive regions of the first sensor and the second sensor according to the present invention;
FIG. 3C is a schematic bottom cross-sectional view of a light-sensitive area of a first sensor and a light-sensitive area of a second sensor according to the present invention;
FIG. 4 is a flowchart illustrating a method for entering an optical zoom mode in an image processing method according to the present invention;
FIG. 5 is a flowchart of a method for determining exposure parameters in an image processing method according to the present invention;
FIG. 6 is a flow chart of another image processing method provided by the present invention;
FIG. 7 is a schematic structural diagram of an image processing apparatus according to the present invention;
FIG. 8 is a schematic structural diagram of another image processing apparatus according to the present invention;
fig. 9 is a schematic structural diagram of an electronic device provided in the present invention;
fig. 10 is a schematic structural diagram of another electronic device provided in the present invention.
Detailed Description
Each image processing method provided by the invention may be executed by an image processing apparatus, and the image processing apparatus may be integrated, by means of software and/or hardware, in any electronic device provided with a camera. The electronic device may be any device provided with a camera, such as a mobile phone, a tablet computer or a smart wearable device.
Before the image processing method is introduced, the electronic device is briefly described. The electronic device may include: an Application Processor (AP), a Digital Signal Processor (DSP), and a sensor. The electronic device may include a plurality of cameras, each camera including a lens and an image sensor. The sensor may be the image sensor in any camera of the electronic device, and may be any type of image sensor, such as a Charge-Coupled Device (CCD) image sensor or a Complementary Metal-Oxide-Semiconductor (CMOS) image sensor.
The following is described in connection with a number of examples.
Fig. 1 is a flowchart of an image processing method of an electronic device according to the present invention. The image processing method may be performed by an electronic device. As shown in fig. 1, the method may include:
s101, acquiring a preview image generated by the first sensor.
The first sensor may be the image sensor of a camera of the electronic device whose angle of view is larger than a preset angle of view. In one example, the first sensor may be the sensor of a wide-angle (Wide) camera.
After the first sensor generates a preview image, the generated preview image may be transmitted to the DSP, and in S101, the AP may acquire the preview image generated by the first sensor from the DSP. The first sensor may transmit the preview image generated by the first sensor to the DSP through a Mobile Industry Processor Interface (MIPI) bus between the first sensor and the DSP, and the DSP may transmit the preview image generated by the first sensor to the AP through a MIPI bus between the DSP and the AP.
S102, determining a first exposure parameter of the first sensor and a second exposure parameter of the second sensor according to the brightness value of the preview image; wherein the first sensor has a larger field of view than the second sensor.
Specifically, the second sensor may be the image sensor of a camera of the electronic device whose angle of view is smaller than the preset angle of view. In one example, the second sensor may be the image sensor of a telephoto (Tele) camera. The first sensor may also be referred to as a primary sensor, and the second sensor may also be referred to as a secondary sensor.
In S102, the first exposure parameter and the second exposure parameter may be determined according to the brightness value of the preview image by respectively using the exposure algorithm corresponding to each sensor. In one example, the exposure algorithm corresponding to the first sensor is different from the exposure algorithm corresponding to the second sensor, so that the first exposure parameter and the second exposure parameter are generally different parameters.
Wherein the first exposure parameter and the second exposure parameter may include at least one of: exposure time, shutter speed, sensitivity, etc.
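By way of illustration only, the following Python sketch shows one possible way of grouping such parameters into a single structure; the field names and default values are illustrative assumptions and are not part of the invention:

    from dataclasses import dataclass

    @dataclass
    class ExposureParams:
        # Illustrative fields only; the invention requires at least one such parameter.
        exposure_time_us: int = 10000   # exposure (integration) time in microseconds
        shutter_speed_s: float = 0.01   # shutter speed in seconds
        iso: int = 100                  # sensitivity (ISO)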
S103, transmitting the first exposure parameter to the first sensor, and transmitting the second exposure parameter to the second sensor, so that the first sensor performs exposure according to the first exposure parameter, and the second sensor performs exposure according to the second exposure parameter.
In S103, the AP may transmit the first exposure parameter to the first sensor through a bus between the AP and the first sensor, and transmit the second exposure parameter to the second sensor through a bus between the AP and the second sensor. After receiving the first exposure parameter, the first sensor may perform exposure according to the first exposure parameter to generate a first image. After receiving the second exposure parameter, the second sensor may perform exposure according to the second exposure parameter to generate a second image.
After the first sensor generates the first image, the first image can be transmitted to the DSP through a connecting bus between the first sensor and the DSP; after the second sensor generates the second image, the second image can be transmitted to the DSP through a connection bus between the second sensor and the DSP. After receiving the first image and the second image, the DSP can perform image synthesis on the first image and the second image to obtain a target image. Wherein the first image may also be referred to as a first Raw (Raw) image and the second image may be referred to as a second Raw image with respect to the DSP. Since the first exposure parameter and the second exposure parameter are determined according to the brightness value of the preview image, the brightness of the first image obtained according to the first exposure parameter and the brightness of the second image obtained according to the second exposure parameter can be kept consistent, for example, the brightness difference is within a preset range.
According to the image processing method provided by the invention, a preview image generated by a first sensor is acquired, a first exposure parameter of the first sensor and a second exposure parameter of a second sensor are determined according to the brightness value of the preview image, the first exposure parameter is transmitted to the first sensor, and the second exposure parameter is transmitted to the second sensor, so that the first sensor performs exposure according to the first exposure parameter and the second sensor performs exposure according to the second exposure parameter. Because both exposure parameters are determined according to the brightness value of the same preview image, the brightness difference between the image obtained by the first sensor exposing according to the first exposure parameter and the image obtained by the second sensor exposing according to the second exposure parameter can be kept within a preset range; that is, the brightness of the two images can be kept consistent. Therefore, when the images obtained by the two sensors are synthesized, the success rate of the image synthesis is high, and the quality of the target image obtained by the synthesis is effectively improved.
In addition, since the first exposure parameter and the second exposure parameter are both determined according to the brightness value of the same preview image and are then transmitted to the first sensor and the second sensor respectively, synchronous exposure of the first sensor and the second sensor can be realized.
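The AP-side control flow of S101 to S103 can be summarized by the simplified Python sketch below. The dsp, first_sensor and second_sensor objects and their get_preview and set_exposure methods are hypothetical interfaces introduced only for illustration, and the two exposure-algorithm callables stand for the per-sensor AE algorithms described later; this is a sketch of the idea, not a definitive implementation:

    import numpy as np

    def dual_exposure_flow(dsp, first_sensor, second_sensor,
                           first_ae_algorithm, second_ae_algorithm):
        # S101: obtain the preview image generated by the first (wide-angle) sensor from the DSP.
        preview = dsp.get_preview(first_sensor)               # hypothetical interface

        # S102: derive both exposure parameter sets from the brightness of the same preview image.
        brightness = float(np.asarray(preview, dtype=np.float64).mean())
        first_params = first_ae_algorithm(brightness)
        second_params = second_ae_algorithm(brightness)

        # S103: transmit each parameter set to its own sensor so that the two sensors
        # expose in a coordinated (synchronous) manner.
        first_sensor.set_exposure(first_params)               # hypothetical interface
        second_sensor.set_exposure(second_params)             # hypothetical interface
        return first_params, second_params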
Optionally, the present invention further provides an image processing method. Fig. 2 is a flowchart of a method for determining an exposure parameter in an image processing method according to the present invention. As shown in fig. 2, determining the first exposure parameter of the first sensor and the second exposure parameter of the second sensor according to the brightness value of the preview image in S102 may include:
s201, determining a photosensitive area overlapped by the photosensitive area of the first sensor and the photosensitive area of the second sensor.
The overlapped photosensitive area may be the central area of the photosensitive area of the first sensor onto which the second sensor is mapped. In S201, the overlapped photosensitive area may be determined according to the size of the photosensitive area of the first sensor, the field angle of the first sensor, and the field angle of the second sensor.
And S202, determining the brightness value of the overlapped photosensitive area in the preview image.
In order to ensure that the brightness value used for determining the exposure parameter is related to both the first sensor and the second sensor, brightness statistics may be performed on the overlapped photosensitive areas in the preview image in S202, so as to obtain the brightness value of the overlapped photosensitive areas in the preview image.
S203, determining the first exposure parameter and the second exposure parameter according to the brightness value of the overlapped photosensitive area.
In this method, because the brightness value used to determine the first exposure parameter and the second exposure parameter is the brightness value of the overlapped photosensitive area in the preview image, the brightness difference between the image obtained by the first sensor exposing according to the first exposure parameter and the image obtained by the second sensor exposing according to the second exposure parameter can be made as small as possible. This effectively ensures that the images obtained by the first sensor and the second sensor are consistent in brightness, improves the success rate of synthesizing the images obtained by the two sensors, and effectively improves the quality of the target image obtained by the image synthesis.
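A minimal Python sketch of S201 to S203 is given below, assuming (purely for illustration) that the overlapped area has already been sized and is centred in the first sensor's area as in Fig. 3B; the helper names are assumptions rather than part of the invention:

    import numpy as np

    def centered_overlap_rect(preview_w, preview_h, overlap_w, overlap_h):
        # S201 (position): the overlapped area is taken as the central region of the
        # first sensor's photosensitive area that the second sensor maps onto.
        x = (preview_w - overlap_w) // 2
        y = (preview_h - overlap_h) // 2
        return x, y, overlap_w, overlap_h

    def overlap_brightness(preview, rect):
        # S202: brightness statistics are performed only on the overlapped area of the preview image.
        x, y, w, h = rect
        region = np.asarray(preview, dtype=np.float64)[y:y + h, x:x + w]
        return float(region.mean())

    # S203 would then feed this brightness value into the first and second exposure algorithms.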
Optionally, the present invention further provides an image processing method. Fig. 3A is a flowchart of a method for determining overlapping photosensitive areas in an image processing method according to the present invention. Fig. 3B is a schematic diagram illustrating positions of a photosensitive region of a first sensor and a photosensitive region of a second sensor according to the present invention. Fig. 3C is a schematic bottom cross-sectional view of a photosensitive area of a first sensor and a photosensitive area of a second sensor according to the present invention. As shown in fig. 3A, determining the photosensitive area of the first sensor overlapping the photosensitive area of the second sensor in S201 may include:
s301, determining the width of the overlapped photosensitive area according to the field angle of the first sensor, the field angle of the second sensor and the width of the photosensitive area of the first sensor in the following manner (1).
tan (α/2)/tan (β/2) ═ l/w equation (1)
Wherein, α is the angle of view of the first sensor, β is the angle of view of the second sensor; l is the width of the overlapping photosensitive area; w is the width of the photosensitive area of the first sensor.
As shown in fig. 3B and 3C, taking the case where the first sensor and the second sensor are arranged in parallel and overlapping as an example: since the angle of view of the first sensor is α, the maximum included angle between the rays that light emitted from the object traces to two positions on the same horizontal line of the first sensor may be α; since the angle of view of the second sensor is β, the maximum included angle between the rays that light emitted from the object traces to two positions on the same horizontal line of the second sensor may be β. Because the field angle α of the first sensor is larger than the field angle β of the second sensor, the photosensitive area of the first sensor may be larger than that of the second sensor.
The overlapped photosensitive area may be the area, within the photosensitive area of the first sensor, bounded by the two positions on the same horizontal line reached by light emitted from the object when the included angle between the corresponding rays equals the field angle β of the second sensor.
The width l obtained using the above equation (1) may be the width of the overlapping photosensitive regions shown in fig. 3B and 3C.
And S302, determining the height of the overlapped photosensitive area according to the width of the overlapped photosensitive area and a preset aspect ratio.
The preset aspect ratio may be the ratio of the width to the height of the photosensitive area, and may be any ratio such as 4/3 or 16/9. The ratio of the width of the overlapped photosensitive area to its height may be this preset aspect ratio.
Taking a preset aspect ratio of 4/3 as an example, in S302 the height of the overlapped photosensitive area may be determined as the product of the width of the overlapped photosensitive area and the reciprocal of the preset aspect ratio 4/3, that is, 3/4.
And S303, determining the overlapped photosensitive area according to the width of the overlapped photosensitive area and the height of the overlapped photosensitive area.
The overlapping photosensitive regions may be rectangular photosensitive regions formed by the width of the overlapping photosensitive regions and the height of the overlapping photosensitive regions.
In the image processing method, the width of the overlapped photosensitive area is determined according to the field angle of the first sensor, the field angle of the second sensor and the width of the photosensitive area of the first sensor, the height of the overlapped photosensitive area is determined according to the width of the overlapped photosensitive area and a preset aspect ratio, and then the overlapped photosensitive area is determined according to the width of the overlapped photosensitive area and the height of the overlapped photosensitive area, so that the determination of the overlapped photosensitive area can be more accurate.
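The computation of S301 to S303 can be illustrated with the short Python sketch below, which follows Equation (1) and the preset aspect ratio; the example field angles and sensor width at the end are purely illustrative assumptions:

    import math

    def overlapping_region_size(alpha_deg, beta_deg, first_sensor_width, aspect_ratio=4 / 3):
        # S301: Equation (1), tan(beta/2) / tan(alpha/2) = l / w, solved for l.
        alpha = math.radians(alpha_deg)    # field angle of the first (wide-angle) sensor
        beta = math.radians(beta_deg)      # field angle of the second (telephoto) sensor
        l = first_sensor_width * math.tan(beta / 2) / math.tan(alpha / 2)

        # S302: the height follows from the preset aspect ratio; e.g. 4/3 means height = l * 3/4.
        height = l / aspect_ratio

        # S303: the overlapped area is the rectangle with width l and this height.
        return l, height

    # Illustrative example: alpha = 80 degrees, beta = 40 degrees, first sensor 4000 pixels wide.
    # overlapping_region_size(80, 40, 4000) gives a width of roughly 1735 and a height of roughly 1301.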
Optionally, the present invention further provides an image processing method. Fig. 4 is a flowchart of a method for entering an optical zoom mode in an image processing method according to the present invention. As shown in fig. 4, before acquiring the preview image generated by the first sensor in S101 as described above, the method may further include:
s401, receiving an input zoom command.
Specifically, in S401, a zoom instruction input by a user through at least one of voice, touch click operation, pressing a physical key, and the like may be received. The zoom instruction may be an instruction input by a user according to a selected optical zoom mode.
S402, controlling the first camera and the second camera to enter an optical zoom mode according to the zoom instruction; the first camera is a camera corresponding to the first sensor, and the second camera is a camera corresponding to the second sensor.
When the first camera and the second camera enter the optical zoom mode, that is, when the cameras of the electronic device enter the zoom mode, dual-channel synchronous exposure is triggered: acquisition of the preview image of the first sensor is triggered, and the image processing method described in any one of fig. 1 to 3A is then executed.
As for the preview image generated by the second sensor, in the invention the AP does not need to acquire an image from the second sensor; alternatively, the AP may acquire an image from the second sensor but not calculate its brightness value; or the AP may acquire an image from the second sensor and calculate its brightness value, but in actual use only adopt the brightness value of the preview image obtained from the first sensor and discard the brightness value of the image obtained from the second sensor. Of course, the image obtained by the second sensor may also be processed in other ways in the present invention; the above is only an example, and the present invention is not limited thereto.
Optionally, the present invention further provides an image processing method. Fig. 5 is a flowchart of a method for determining exposure parameters in an image processing method according to the present invention. As shown in fig. 5, determining the first exposure parameter and the second exposure parameter according to the brightness value of the overlapped photosensitive area in S203 as shown above may include:
s501, determining the first exposure parameter according to the brightness value of the preview image and a preset first exposure algorithm; wherein, the first exposure algorithm is the exposure algorithm corresponding to the first sensor.
S502, determining a second exposure parameter according to the brightness value of the preview image and a preset second exposure algorithm; and the second exposure algorithm is an exposure algorithm corresponding to the second sensor.
Specifically, the first Exposure algorithm may be an Auto Exposure (AE) algorithm corresponding to the first sensor. The second exposure algorithm may be an AE algorithm corresponding to the second sensor. The first exposure algorithm and the second exposure algorithm may be different exposure algorithms. The AP may obtain different exposure parameters, i.e., the first exposure parameter and the second exposure parameter, by using different exposure algorithms according to the brightness value of the preview image.
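Purely as an illustration of how the same brightness value can be fed into two different AE algorithms to obtain two different sets of exposure parameters, a simplified Python sketch is given below; real AE algorithms and tuning values are considerably more involved, and the target value and base constants shown are assumptions:

    def first_exposure_algorithm(brightness, target=128.0,
                                 base_exposure_us=10000, base_iso=100):
        # Illustrative AE rule for the first (wide-angle) sensor: scale a base exposure
        # so that the measured brightness moves towards the target value.
        gain = target / max(brightness, 1.0)
        return {"exposure_time_us": int(base_exposure_us * gain), "iso": base_iso}

    def second_exposure_algorithm(brightness, target=128.0,
                                  base_exposure_us=8000, base_iso=125):
        # Illustrative AE rule for the second (telephoto) sensor: same brightness input,
        # different tuning, so the resulting parameters generally differ from the first sensor's.
        gain = target / max(brightness, 1.0)
        return {"exposure_time_us": int(base_exposure_us * gain), "iso": base_iso}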
Optionally, the transmitting the first exposure parameter to the first sensor and the transmitting the second exposure parameter to the second sensor in S103 may include:
the first exposure parameter is transmitted to the first sensor through an Inter-Integrated Circuit (I2C) bus with the first sensor, and the second exposure parameter is transmitted to the second sensor through an I2C bus with the second sensor.
That is, the AP and the first sensor may be connected via an I2C bus, and the AP and the second sensor may be connected via an I2C bus. Therefore, the AP may access the first sensor through the I2C bus with the first sensor, i.e. send the first exposure parameter to the first sensor; the AP may access the second sensor through an I2C bus with the second sensor, i.e., send the second exposure parameter to the second sensor.
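As a rough illustration of writing an exposure value to a sensor over an I2C bus from a Linux user-space process, the sketch below uses the smbus2 package; the bus numbers, device addresses and register offsets are hypothetical and depend entirely on the actual sensors used, so this is only an assumption-laden sketch of the transmission step:

    from smbus2 import SMBus  # user-space I2C access on Linux

    # Hypothetical values; real sensors define their own I2C addresses and register maps.
    FIRST_SENSOR_I2C_BUS, FIRST_SENSOR_ADDR = 0, 0x36
    SECOND_SENSOR_I2C_BUS, SECOND_SENSOR_ADDR = 1, 0x10
    EXPOSURE_REG_HI, EXPOSURE_REG_LO = 0x35, 0x36

    def write_exposure(bus_id, sensor_addr, exposure_value):
        # Writes a 16-bit exposure value to the sensor over the AP-to-sensor I2C bus.
        with SMBus(bus_id) as bus:
            bus.write_byte_data(sensor_addr, EXPOSURE_REG_HI, (exposure_value >> 8) & 0xFF)
            bus.write_byte_data(sensor_addr, EXPOSURE_REG_LO, exposure_value & 0xFF)

    # Illustrative usage (the exposure values themselves come from the AE algorithms above):
    # write_exposure(FIRST_SENSOR_I2C_BUS, FIRST_SENSOR_ADDR, first_exposure_value)
    # write_exposure(SECOND_SENSOR_I2C_BUS, SECOND_SENSOR_ADDR, second_exposure_value)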
Optionally, the first sensor is an image sensor of a wide-angle camera, and the second sensor is an image sensor of a telephoto camera.
The invention also provides an image processing method. The image processing method shown below illustrates, by way of a specific example, the image processing method described in any one of fig. 1 to 5 above. Fig. 6 is a flowchart of another image processing method provided by the present invention. As shown in fig. 6, the method may include:
s601, the AP of the electronic equipment receives an input zoom command.
S602, the AP controls the wide-angle camera and the telephoto camera of the electronic device to enter an optical zoom mode according to the zoom instruction.
S603, the first sensor of the wide-angle camera sends the preview image it generates to the DSP through the MIPI bus between the first sensor and the DSP of the electronic device.
S604, the DSP sends the preview image generated by the first sensor to the AP through the MIPI bus between the DSP and the AP.
S605, the AP determines the photosensitive area where the photosensitive area of the first sensor overlaps the photosensitive area of the second sensor of the telephoto camera.
S606, the AP determines the brightness value of the overlapped photosensitive area in the preview image.
S607, the AP determines the first exposure parameter and the second exposure parameter according to the brightness value of the overlapped photosensitive area.
S608, the AP transmits the first exposure parameter to the first sensor through an I2C bus between the AP and the first sensor, and transmits the second exposure parameter to the second sensor through an I2C bus between the AP and the second sensor.
S609, the first sensor performs exposure according to the first exposure parameter to obtain a first image.
S610, the second sensor performs exposure according to the second exposure parameter to obtain a second image.
S611, the first sensor transmits the first image to the DSP through the MIPI bus between the first sensor and the DSP, and the second sensor transmits the second image to the DSP through the MIPI bus between the second sensor and the DSP.
S612, the DSP performs image synthesis on the first image and the second image to obtain a target image, and transmits the target image to the AP through the MIPI bus between the DSP and the AP.
S613, the AP transmits the target image to a display of the electronic device to display the target image.
In this image processing method, because the first exposure parameter and the second exposure parameter are determined according to the brightness value, in the preview image, of the photosensitive area where the first sensor and the second sensor overlap, the brightness difference between the image obtained by the first sensor exposing according to the first exposure parameter and the image obtained by the second sensor exposing according to the second exposure parameter can be kept within a preset range; that is, the brightness of the two images can be kept consistent. Therefore, when the images obtained by the two sensors are synthesized, the success rate of the image synthesis is high, and the quality of the target image obtained by the synthesis is effectively improved. Moreover, since both exposure parameters are determined according to the brightness value of the same preview image and are transmitted to the first sensor and the second sensor respectively, synchronous exposure of the first sensor and the second sensor can be realized.
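The patent does not prescribe a particular synthesis algorithm for S612. Purely as an assumption-laden Python sketch, the DSP-side step could be pictured as replacing the overlapped central region of the wide-angle image with an already registered and resized telephoto image; practical implementations would perform registration and blending rather than simple replacement:

    import numpy as np

    def synthesize(first_image, second_image, overlap_rect):
        # first_image: image exposed by the first (wide-angle) sensor
        # second_image: image exposed by the second (telephoto) sensor, assumed here to be
        #               already registered and resized to the overlapped area
        # overlap_rect: (x, y, width, height) of the overlapped photosensitive area
        x, y, w, h = overlap_rect
        target = np.asarray(first_image).copy()
        target[y:y + h, x:x + w] = second_image   # naive replacement, for illustration only
        return target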
The invention also provides an image processing device. The image processing apparatus may be integrated in the electronic device by means of software and/or hardware. Fig. 7 is a schematic structural diagram of an image processing apparatus according to the present invention. As shown in fig. 7, the image processing apparatus 700 may include:
an obtaining module 701, configured to obtain a preview image generated by the first sensor.
A determining module 702, configured to determine a first exposure parameter of the first sensor and a second exposure parameter of the second sensor according to the brightness value of the preview image.
A transmission module 703, configured to transmit the first exposure parameter to the first sensor and transmit the second exposure parameter to the second sensor, so that the first sensor performs exposure according to the first exposure parameter and the second sensor performs exposure according to the second exposure parameter; wherein the first sensor has a larger field of view than the second sensor.
Optionally, the determining module 702 is specifically configured to determine a photosensitive area where the photosensitive area of the first sensor overlaps with the photosensitive area of the second sensor; determining a brightness value of the overlapping photosensitive area in the preview image; the first exposure parameter and the second exposure parameter are determined according to the brightness value of the overlapped photosensitive area.
Optionally, the determining module 702 is specifically configured to: determine the width of the overlapped photosensitive area according to the field angle of the first sensor, the field angle of the second sensor and the width of the photosensitive area of the first sensor, using Equation (1) below; determine the height of the overlapped photosensitive area according to the width of the overlapped photosensitive area and a preset aspect ratio; and determine the overlapped photosensitive area according to the width of the overlapped photosensitive area and the height of the overlapped photosensitive area.
tan(β/2)/tan(α/2) = l/w    Equation (1)
Wherein, α is the angle of view of the first sensor, β is the angle of view of the second sensor; l is the width of the overlapping photosensitive area; w is the width of the photosensitive area of the first sensor.
Optionally, fig. 8 is a schematic structural diagram of another image processing apparatus provided in the present invention. As shown in fig. 8, the image processing apparatus 700 may further include:
an input module 704, configured to receive an input zoom instruction before the obtaining module 701 obtains the preview image generated by the first sensor.
A control module 705, configured to control the first camera and the second camera to enter an optical zoom mode according to the zoom instruction; the first camera is a camera corresponding to the first sensor, and the second camera is a camera corresponding to the second sensor.
Optionally, the determining module 702 is specifically configured to determine the first exposure parameter according to the brightness value of the preview image and a preset first exposure algorithm; determining the second exposure parameter according to the brightness value of the preview image and a preset second exposure algorithm; wherein, the first exposure algorithm is an exposure algorithm corresponding to the first sensor; the second exposure algorithm is an exposure algorithm corresponding to the second sensor.
Optionally, the transmitting module 703 is specifically configured to transmit the first exposure parameter to the first sensor through an I2C bus connected to the first sensor; the second exposure parameter is transmitted to the second sensor via an I2C bus with the second sensor.
Optionally, the first sensor is an image sensor of a wide-angle camera, and the second sensor is an image sensor of a telephoto camera.
The image processing apparatus provided by the present invention can execute the image processing method described in any one of fig. 1 to fig. 6, and the specific implementation process and beneficial effects thereof are referred to above, and are not described herein again.
The invention also provides an electronic device. Fig. 9 is a schematic structural diagram of an electronic device provided in the present invention. As shown in fig. 9, the electronic device may include: AP901, DSP902, first sensor 903, and second sensor 904. The AP901 is connected to the first sensor 903 and the second sensor 904, respectively. The DSP902 is respectively connected with the first sensor 903 and the second sensor 904, and the DSP902 is also connected with the AP901; wherein the field angle of the first sensor 903 is greater than the field angle of the second sensor 904.
The AP901 is configured to acquire a preview image generated by the first sensor 903; determining a first exposure parameter of the first sensor 903 and a second exposure parameter of the second sensor 904 according to the brightness value of the preview image; the first exposure parameter is transmitted to the first sensor 903 and the second exposure parameter is transmitted to the second sensor 904.
The first sensor 903 is used for generating a first image according to the first exposure parameter and transmitting the first image to the DSP 902.
The second sensor 904 is configured to generate a second image according to the second exposure parameter and transmit the second image to the DSP 902.
The DSP902 is configured to perform image synthesis on the first image and the second image to obtain a target image.
Alternatively, in the electronic device shown in fig. 9, the AP901 is connected to the first sensor 903 through an I2C bus, and the AP901 is also connected to the second sensor 904 through an I2C bus.
A first Transmit (Tx) Interface of the AP901 is connected to a first Receive (Rx) Interface of the DSP902 through a Serial Peripheral Interface (SPI) bus; the second transmission interface of the AP901 is connected to a second Receive (Rx) interface of the DSP902 through a General Purpose Input/Output (GPIO) interface bus.
A first receiving interface of the AP901 is connected to a first transmission interface of the DSP902 through an MIPI bus; and a second receiving interface of the AP901 is connected with a second transmission interface of the DSP902 through an MIPI bus.
A third receiving interface of the DSP902 is connected to the first sensor 903 through an MIPI bus; a fourth receiving interface of the DSP902 is connected to the second sensor 904 via a MIPI bus.
The DSP902 is also connected with the first sensor 903 through an I2C bus; the DSP902 is connected to the second sensor 904 via an I2C bus.
Optionally, fig. 10 is a schematic structural diagram of another electronic device provided by the present invention. As shown in fig. 10, the electronic device 900 may further include: a display 905; the display 905 is connected to the AP 901.
A display 905 for displaying the target image acquired by the AP901 from the DSP 902.
The electronic device provided by the present invention can execute the image processing method described in any one of fig. 1 to fig. 6, and the specific implementation process and beneficial effects thereof are referred to above, and are not described herein again.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (14)

1. An image processing method, comprising:
acquiring a preview image generated by a first sensor;
determining a first exposure parameter of the first sensor and a second exposure parameter of the second sensor according to the brightness value of the preview image; wherein a field angle of the first sensor is greater than a field angle of the second sensor;
transmitting the first exposure parameter to the first sensor and the second exposure parameter to the second sensor, so that the first sensor performs exposure according to the first exposure parameter and the second sensor performs exposure according to the second exposure parameter;
the determining a first exposure parameter of the first sensor and a second exposure parameter of the second sensor according to the brightness value of the preview image includes:
determining a photosensitive area of the first sensor overlapping a photosensitive area of the second sensor;
determining brightness values of the overlapping photosensitive regions in the preview image;
and determining the first exposure parameter and the second exposure parameter according to the brightness value of the overlapped photosensitive area.
2. The method of claim 1, wherein determining a photosensitive area of the first sensor that overlaps a photosensitive area of the second sensor comprises:
determining, according to the field angle of the first sensor, the field angle of the second sensor and the width of the photosensitive area of the first sensor, the width of the overlapped photosensitive area using Equation (1) below;
determining the height of the overlapped photosensitive area according to the width of the overlapped photosensitive area and a preset aspect ratio;
determining the overlapped photosensitive area according to the width of the overlapped photosensitive area and the height of the overlapped photosensitive area;
tan(β/2)/tan(α/2) = l/w    Equation (1)
Wherein α is the field of view of the first sensor and β is the field of view of the second sensor; l is the width of the overlapping photosensitive area; w is the width of the photosensitive area of the first sensor.
3. The method of claim 1, wherein prior to acquiring the preview image generated by the first sensor, the method further comprises:
receiving an input zooming instruction;
controlling the first camera and the second camera to enter an optical zooming mode according to the zooming instruction; the first camera is the camera corresponding to the first sensor, and the second camera is the camera corresponding to the second sensor.
4. The method of any of claims 1-3, wherein determining a first exposure parameter for the first sensor and a second exposure parameter for a second sensor based on a brightness value of the preview image comprises:
determining the first exposure parameter according to the brightness value of the preview image and a preset first exposure algorithm; wherein the first exposure algorithm is an exposure algorithm corresponding to the first sensor;
determining the second exposure parameter according to the brightness value of the preview image and a preset second exposure algorithm; wherein the second exposure algorithm is an exposure algorithm corresponding to the second sensor.
5. The method of any of claims 1-3, wherein transmitting the first exposure parameter to the first sensor and the second exposure parameter to the second sensor comprises:
transmitting the first exposure parameter to the first sensor through an inter-integrated circuit I2C bus with the first sensor;
transmitting the second exposure parameter to the second sensor via an I2C bus with the second sensor.
6. The method of any of claims 1-3, wherein the first sensor is an image sensor of a wide-angle camera; the second sensor is an image sensor of a telephoto camera.
7. An image processing apparatus characterized by comprising:
the acquisition module is used for acquiring a preview image generated by the first sensor;
the determining module is used for determining a first exposure parameter of the first sensor and a second exposure parameter of the second sensor according to the brightness value of the preview image;
the transmission module is used for transmitting the first exposure parameter to the first sensor and transmitting the second exposure parameter to the second sensor so that the first sensor performs exposure according to the first exposure parameter and the second sensor performs exposure according to the second exposure parameter; wherein a field angle of the first sensor is greater than a field angle of the second sensor;
the determining module is specifically configured to determine a photosensitive area where a photosensitive area of the first sensor overlaps a photosensitive area of the second sensor; determining brightness values of the overlapping photosensitive regions in the preview image; and determining the first exposure parameter and the second exposure parameter according to the brightness value of the overlapped photosensitive area.
8. The apparatus of claim 7,
the determining module is specifically configured to: determine, according to the field angle of the first sensor, the field angle of the second sensor and the width of the photosensitive area of the first sensor, the width of the overlapped photosensitive area using Equation (1) below; determine the height of the overlapped photosensitive area according to the width of the overlapped photosensitive area and a preset aspect ratio; and determine the overlapped photosensitive area according to the width of the overlapped photosensitive area and the height of the overlapped photosensitive area;
tan(β/2)/tan(α/2) = l/w    Equation (1)
Wherein α is the field of view of the first sensor and β is the field of view of the second sensor; l is the width of the overlapping photosensitive area; w is the width of the photosensitive area of the first sensor.
9. The apparatus of claim 7, further comprising:
the input module is used for receiving an input zooming instruction before the acquisition module acquires the preview image generated by the first sensor;
the control module is used for controlling the first camera and the second camera to enter an optical zooming mode according to the zooming instruction;
the first camera is the camera corresponding to the first sensor, and the second camera is the camera corresponding to the second sensor.
10. The apparatus according to any one of claims 7-9,
the determining module is specifically configured to determine the first exposure parameter according to the brightness value of the preview image and a preset first exposure algorithm; determining the second exposure parameter according to the brightness value of the preview image and a preset second exposure algorithm; the first exposure algorithm is an exposure algorithm corresponding to the first sensor; the second exposure algorithm is an exposure algorithm corresponding to the second sensor.
11. The apparatus according to any one of claims 7-9,
the transmission module is specifically configured to transmit the first exposure parameter to the first sensor through an inter-integrated circuit I2C bus connected to the first sensor; transmitting the second exposure parameter to the second sensor via an I2C bus with the second sensor.
12. The apparatus of any of claims 7-9, wherein the first sensor is an image sensor of a wide-angle camera; the second sensor is an image sensor of a telephoto camera.
13. An electronic device, comprising: an application processor AP, a digital signal processor DSP, a first sensor and a second sensor; the AP is respectively connected with the first sensor and the second sensor; the DSP is respectively connected with the first sensor and the second sensor, and the DSP is also connected with the AP; wherein a field angle of the first sensor is greater than a field angle of the second sensor;
the AP is used for acquiring a preview image generated by the first sensor; determining a first exposure parameter of the first sensor and a second exposure parameter of the second sensor according to the brightness value of the preview image; transmitting the first exposure parameter to the first sensor and the second exposure parameter to the second sensor;
the first sensor is used for generating a first image according to the first exposure parameter and transmitting the first image to the DSP;
the second sensor is used for generating a second image according to the second exposure parameter and transmitting the second image to the DSP;
the DSP is used for carrying out image synthesis on the first image and the second image to obtain a target image;
determining a first exposure parameter of the first sensor and a second exposure parameter of the second sensor according to a brightness value of a preview image generated by the first sensor acquired by the AP, including:
firstly, determining a photosensitive area overlapped by a photosensitive area of the first sensor and a photosensitive area of the second sensor; then determining the brightness value of the overlapped photosensitive area in the preview image; and determining the first exposure parameter and the second exposure parameter according to the brightness value of the overlapped photosensitive area.
14. The electronic device of claim 13, further comprising:
a display; the display is connected with the AP;
the display is used for displaying the target image acquired by the AP from the DSP.
CN201611232344.0A 2016-12-28 2016-12-28 Image processing method and device and electronic equipment Active CN107071291B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611232344.0A CN107071291B (en) 2016-12-28 2016-12-28 Image processing method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611232344.0A CN107071291B (en) 2016-12-28 2016-12-28 Image processing method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN107071291A CN107071291A (en) 2017-08-18
CN107071291B true CN107071291B (en) 2020-08-18

Family

ID=59624388

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611232344.0A Active CN107071291B (en) 2016-12-28 2016-12-28 Image processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN107071291B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107592473A (en) * 2017-10-31 2018-01-16 广东欧珀移动通信有限公司 Exposure parameter method of adjustment, device, electronic equipment and readable storage medium storing program for executing
WO2019091423A1 (en) * 2017-11-08 2019-05-16 Fuzhou Rockchip Electronics Co., Ltd. An image‐processing microprocessor for supporting an application processor and multiple cameras
CN109756664B (en) * 2017-11-08 2020-09-11 瑞芯微电子股份有限公司 Intelligent electronic equipment and image processing unit, device and method
CN109981969B (en) * 2017-12-28 2020-05-15 福州瑞芯微电子股份有限公司 Intelligent electronic equipment, image processing unit and image processing method
CN109819139B (en) * 2017-11-22 2022-01-07 瑞芯微电子股份有限公司 Intelligent electronic equipment and image processing unit, device and method
CN110049257B (en) 2019-05-31 2021-09-21 影石创新科技股份有限公司 Method for determining semi-synchronous exposure parameters and electronic device
CN110519540A (en) * 2019-08-29 2019-11-29 深圳市道通智能航空技术有限公司 A kind of image processing method, device, equipment and storage medium
CN114143472A (en) * 2019-09-02 2022-03-04 深圳市道通智能航空技术股份有限公司 Image exposure method and device, shooting equipment and unmanned aerial vehicle
CN112640421A (en) * 2020-03-18 2021-04-09 深圳市大疆创新科技有限公司 Exposure method, exposure device, shooting equipment, movable platform and storage medium
CN111741222B (en) * 2020-07-09 2021-08-24 Oppo(重庆)智能科技有限公司 Image generation method, image generation device and terminal equipment
CN114520870B (en) * 2020-11-20 2023-06-20 华为技术有限公司 Display method and terminal
CN117837160A (en) * 2021-11-05 2024-04-05 深圳市大疆创新科技有限公司 Control method and device for movable platform, movable platform and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2592823A3 (en) * 2011-10-12 2013-06-19 Canon Kabushiki Kaisha Image-capturing device
EP3105920B1 (en) * 2014-02-11 2020-07-22 Robert Bosch GmbH Brightness and color matching video from multiple-camera system
CN104853106B (en) * 2014-02-19 2019-11-26 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN104301624B (en) * 2014-10-30 2018-04-03 青岛海信移动通信技术股份有限公司 A kind of image taking brightness control method and device
CN107454343B (en) * 2014-11-28 2019-08-02 Oppo广东移动通信有限公司 Photographic method, camera arrangement and terminal
CN104994288B (en) * 2015-06-30 2018-03-27 广东欧珀移动通信有限公司 A kind of photographic method and user terminal
CN105635568B (en) * 2015-12-25 2019-09-17 青岛海信移动通信技术股份有限公司 Image processing method and mobile terminal in a kind of mobile terminal

Also Published As

Publication number Publication date
CN107071291A (en) 2017-08-18

Similar Documents

Publication Publication Date Title
CN107071291B (en) Image processing method and device and electronic equipment
CN110012224B (en) Camera anti-shake system, camera anti-shake method, electronic device, and computer-readable storage medium
US10178338B2 (en) Electronic apparatus and method for conditionally providing image processing by an external apparatus
US9973672B2 (en) Photographing for dual-lens device using photographing environment determined using depth estimation
US8773509B2 (en) Imaging device, imaging method and recording medium for adjusting imaging conditions of optical systems based on viewpoint images
CN110035228B (en) Camera anti-shake system, camera anti-shake method, electronic device, and computer-readable storage medium
US9996934B2 (en) Device with an adaptive camera array
JP2019030007A (en) Electronic device for acquiring video image by using plurality of cameras and video processing method using the same
JP7197981B2 (en) Camera, terminal device, camera control method, terminal device control method, and program
WO2018053908A1 (en) Dual-camera photographic method, system, and terminal
EP4195650A1 (en) Photographing method and electronic device
JPWO2014106917A1 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
CN104954672A (en) Mobile terminal and manual focusing method thereof
US10911686B2 (en) Zoom control device, zoom control method, and program
JP2010016826A (en) System and method for efficiently performing image processing operations
CN106713737B (en) Electronic device and image processing method
CN105210362B (en) Image adjusting apparatus, image adjusting method, and image capturing apparatus
US20110164867A1 (en) Digital photographing apparatus and method that apply high-speed multi-autofocusing (af)
US20140063327A1 (en) Photographing apparatus and method of photographing image thereof
JP6231757B2 (en) Imaging apparatus, information processing apparatus, control method therefor, and program
US10944899B2 (en) Image processing device and image processing method
CN109309784B (en) Mobile terminal
EP3043547B1 (en) Imaging apparatus, video data transmitting apparatus, video data transmitting and receiving system, image processing method, and program
WO2015060605A1 (en) Operating method and apparatus for detachable lens type camera
KR101153388B1 (en) Camera module and method for controlling auto exposure thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20180301

Address after: 330008 Jiangxi Province, Nanchang Qingshan Lake Economic and Technological Development Zone of Nanchang Jiao Town Office Building Room 319

Applicant after: Nanchang shark Technology Co. Ltd.

Address before: 518063 Guangdong city of Shenzhen province Nanshan District Wei new software park building 1 floor 2 South

Applicant before: SHENZHEN ZEUSIS TECHNOLOGY CO., LTD.

GR01 Patent grant