WO2022268170A1 - Image processing method, apparatus, and electronic device

Image processing method, apparatus, and electronic device

Info

Publication number
WO2022268170A1
WO2022268170A1 · PCT/CN2022/100841
Authority
WO
WIPO (PCT)
Prior art keywords
camera module
image
target
exposure
exposure parameter
Prior art date
Application number
PCT/CN2022/100841
Other languages
English (en)
French (fr)
Inventor
刘闯 (Liu Chuang)
Original Assignee
维沃移动通信(杭州)有限公司 (Vivo Mobile Communication (Hangzhou) Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 维沃移动通信(杭州)有限公司 (Vivo Mobile Communication (Hangzhou) Co., Ltd.)
Publication of WO2022268170A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • the present application belongs to the technical field of image processing, and specifically relates to an image processing method, device and electronic equipment.
  • In the related art, automatic exposure (AE) processing can be performed on the image acquired by the image acquisition device to prevent the image from being too bright or too dark.
  • The purpose of the embodiments of the present application is to provide an image processing method, device, and electronic device, which can solve the problem that exposure convergence takes a long time.
  • An embodiment of the present application provides an image processing method, which is applied to an image acquisition device including a first camera module and a second camera module. The method includes: acquiring, through the first camera module, brightness information of a target image, where the target image is an image collected for a target object; and obtaining a target exposure parameter according to the brightness information, where the target exposure parameter is the exposure parameter required to make the target image reach a target exposure level.
  • When the second camera module performs image acquisition on the target object, the target exposure parameter is set as the exposure parameter required by the second camera module for exposure processing.
  • the embodiment of the present application provides an image processing device, which is applied to an image acquisition device.
  • the image acquisition device includes a first camera module and a second camera module;
  • The acquisition module is used to acquire, through the first camera module, the brightness information of the target image, where the target image is an image collected for the target object;
  • the processing module is used to obtain the target exposure parameter according to the brightness information acquired by the acquisition module, where the target exposure parameter is the exposure parameter required to make the target image reach the target exposure level;
  • the exposure module is used to set the target exposure parameter obtained by the processing module as the exposure parameter required by the second camera module for exposure processing when the second camera module performs image acquisition on the target object.
  • An embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instruction stored in the memory and executable on the processor; when the program or instruction is executed by the processor, the steps of the method described in the first aspect are implemented.
  • An embodiment of the present application provides a readable storage medium on which a program or instruction is stored; when the program or instruction is executed by a processor, the steps of the method described in the first aspect are implemented.
  • An embodiment of the present application provides a chip, which includes a processor and a communication interface coupled to the processor; the processor is used to run programs or instructions, so as to implement the method described in the first aspect.
  • the image acquisition device to which the image processing method is applicable is an image acquisition device including a first camera module and a second camera module.
  • the brightness information of the target image can be acquired through the first camera module.
  • the target image is an image collected for the target object.
  • the target exposure parameter can be obtained.
  • the target exposure parameter is an exposure parameter required to make the target image reach the target exposure level.
  • the target exposure parameter can be set as the exposure parameter required by the second camera module for exposure processing. Therefore, when the second camera module collects an image of the target object, the first camera module has obtained the target exposure parameters required by the target image collected for the target object.
  • the second camera module can directly perform exposure processing according to the target exposure parameters obtained by the first camera module. Therefore, for the image acquisition device including the first camera module and the second camera module, the time required for exposure convergence can be effectively reduced, and the efficiency of exposure convergence can be improved.
  • Fig. 1 is the flow chart of the steps of the image processing method of the embodiment of the present application.
  • FIG. 2 is a schematic diagram of pixels of an image processing device according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of the photoelectric conversion of the image processing device of the embodiment of the present application.
  • FIG. 4 is a pixel array diagram of an image processing device according to an embodiment of the present application.
  • FIG. 5 is a pixel partition diagram of the target image collected by the first camera module according to the embodiment of the present application.
  • Fig. 6 shows the fields of view of the first camera module and the second camera module of the embodiment of the present application;
  • FIG. 7 is a schematic structural diagram of an image processing device according to an embodiment of the present application.
  • FIG. 8 is one of the structural schematic diagrams of the electronic device according to the embodiment of the present application.
  • FIG. 9 is a second structural schematic diagram of an electronic device according to an embodiment of the present application.
  • The execution subject of the image processing method provided in the embodiment of the present application may be an image processing device, and the device may be an electronic device, or a functional module and/or a functional entity in the electronic device, which may be determined according to actual use requirements and is not limited in the embodiments of the present application.
  • the following method embodiments will be described exemplarily by taking an image processing apparatus as an example for executing the image processing method.
  • an embodiment of the present application provides an image processing method, and the purpose of the method is to improve the speed and efficiency of image exposure processing, thereby improving user experience.
  • the image processing method provided in the embodiment of the present application is applied to an image acquisition device, and the image acquisition device may be a device for realizing image acquisition, or may be an electronic device configured with modules or components for image acquisition.
  • the image acquisition device applicable to the image processing method provided by the embodiment of the present application may be a SLR camera or a single-chip microcomputer, or may be a smart phone, a personal computer or a wearable smart device.
  • the image acquisition device to which the image processing method provided in the embodiment of the present application is applicable may include a first camera module and a second camera module.
  • the first camera module and the second camera module cooperate with each other, and can respectively realize the image acquisition function.
  • the type, structure and size of the first camera module and the second camera module may be the same or different.
  • At least one of the first camera module and the second camera module may be a camera module that images based on a charge-coupled device (Charge Coupled Device, CCD), or a camera module that images based on a complementary metal-oxide semiconductor (Complementary Metal Oxide Semiconductor, CMOS).
  • the structure of the first camera module and the second camera module will be illustrated below by taking a CMOS camera module as an example.
  • the CMOS camera module is a device widely used in related technologies.
  • The CMOS camera module mainly consists of a lens (Lens), a voice coil motor (Voice Coil Motor, VCM), an infrared filter (IR Filter), a CMOS image sensor, a digital signal processor (Digital Signal Processor, DSP), and a flexible printed circuit (Flexible Printed Circuit, FPC).
  • A CMOS image sensor is a semiconductor device mainly made of silicon and germanium, on which N-type (negatively charged) and P-type (positively charged) semiconductors coexist; the current generated by their complementary effects can be recorded and interpreted by the processing chip.
  • The surface of the CMOS image sensor carries hundreds of thousands to millions of photodiodes, each covered with a micro-lens (Micro-lens) and a color filter array (Color Filter Array).
  • The micro-lens guides light into the photodiode, while the color filter array filters the light, allowing only light in the wavelength band corresponding to the filter's color to pass through.
  • When a photodiode is illuminated, it generates an electric charge, converting the light into an electrical signal from which an image is formed.
  • The basic working process of the CMOS camera module is as follows: the voice coil motor drives the lens to the in-focus position; external light passes through the lens, is filtered by the infrared filter, and falls on the photodiodes (Pixel); the photodiodes convert the perceived optical signal into an electrical signal, which passes through an amplification circuit and an A/D conversion circuit to form a digital signal matrix (i.e., an image); this matrix is then processed by the digital signal processor, compressed, and stored.
  • the image processing method provided in the embodiment of the present application includes the following S101 to S103:
  • the image processing device acquires brightness information of a target image through a first camera module.
  • the first camera module may be a camera module with an always-on (AO) function.
  • The AO function enables the chip to output image data continuously at low resolution and a low frame rate.
  • On this basis, the image acquisition device can support AI-related applications, such as smart payment, presence detection, face detection, gesture recognition, smart screen wake-up, and QR code scanning.
  • the AO function supports the perception of the existing environment at any time, making the image acquisition equipment evolve from passive intelligence to active intelligence, and can autonomously perceive the environment and actively interact with users.
  • the first camera module may collect brightness information of the target image through the AO function.
  • The first camera module may be a camera module with an AO function that can also assist with auto-focus, auto-exposure, and color correction.
  • the target image is an image collected for the target object.
  • The target object may be a specific target in the environment, such as a person, an object, or a building; it may also be a non-specific object in the environment, such as scenery.
  • Since the first camera module has the AO function, it can acquire the brightness information of the target image at any time, whether or not the image acquisition device is in use.
  • The first camera module in the embodiment of the present application may adopt the following pinned photodiode pixel (Pinned Photodiode Pixel, PPD) structure.
  • the PPD structure of the first camera module includes a photosensitive area of the PPD, that is, a photodiode, a reset switch RST, a control switch TX, a row selector SET, and a source follower SF.
  • The PPD structure allows a correlated double sampling (Correlated Double Sampling, CDS) circuit to be introduced, thereby eliminating the kTC noise introduced by the reset, as well as the 1/f noise and offset noise introduced by the MOS transistor.
  • FD1, FD2, and FD3 are three capacitors
  • DCG1 and DCG2 are control switches of the three capacitors
  • VDD is a power supply voltage.
  • After the PD is exposed to light, the pixel circuit described above can read out the photoelectric signal at different gains by controlling the two DCG switches, thereby improving the dynamic range and flexibly changing the sensitivity.
  • For example, if the capacitance ratio of FD1, FD2, and FD3 is 1:3:4, then the total capacitance is 1 when both DCG1 and DCG2 are off, 4 when DCG2 is off and DCG1 is on, and 8 when both DCG1 and DCG2 are on.
  • The sensitivity and dynamic range of the pixel circuit described above can therefore be varied by a factor of 8.
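The relationship between the DCG switch states and the total floating-diffusion capacitance described above can be sketched as follows. This is an illustrative Python sketch, not part of the embodiment; it assumes the switches are chained so that DCG2 only adds FD3 when DCG1 is also on, which is consistent with the three states listed above.

```python
# Relative capacitances, FD1 : FD2 : FD3 = 1 : 3 : 4 as in the example above.
FD1, FD2, FD3 = 1, 3, 4

def total_capacitance(dcg1_on: bool, dcg2_on: bool) -> int:
    """FD1 is always connected; DCG1 adds FD2, and (assuming a chained
    layout) DCG2 adds FD3 only when DCG1 is also on."""
    c = FD1
    if dcg1_on:
        c += FD2
        if dcg2_on:
            c += FD3
    return c

# The three states listed above give totals of 1, 4, and 8.
assert total_capacitance(False, False) == 1
assert total_capacitance(True, False) == 4
assert total_capacitance(True, True) == 8
```

Since conversion gain is inversely proportional to total capacitance, the 1-to-8 capacitance range is what yields the 8x variation in sensitivity and dynamic range.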
  • the working mode of the PPD structure of the first camera module is as follows:
  • Exposure: the electron-hole pairs generated by light irradiation are separated by the PPD electric field; the electrons move to the n region and the holes move to the p region.
  • Reset: at the end of exposure, RST can be activated to reset the readout area (the n+ region) to a high level.
  • Reset-level readout: after the reset is completed, the reset level is read out; it includes the offset noise of the MOS transistor, 1/f noise, and the kTC noise introduced by the reset, and the readout signal is stored in the first capacitor.
  • Charge transfer: TX is activated to completely transfer the charge from the photosensitive area to the n+ region for readout.
  • The mechanism here can be understood as similar to the charge transfer in a CCD.
  • Signal-level readout: the voltage signal of the n+ region is read out to the second capacitor.
  • The signal here includes the signal generated by photoelectric conversion, the offset generated by the operational amplifier, 1/f noise, and the kTC noise introduced by the reset.
  • Signal output: the signals stored in the two capacitors are subtracted (e.g., using CDS, which eliminates the main noise in the pixel); the resulting signal is analog-amplified and then sampled by the ADC to output a digital signal.
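The subtraction in the signal-output step can be illustrated with a minimal numeric sketch (the values are hypothetical; the point is only that noise common to both samples cancels):

```python
def correlated_double_sample(reset_level: float, signal_level: float) -> float:
    """CDS: subtract the reset-level sample from the signal-level sample,
    cancelling the noise terms common to both (kTC, amplifier offset, 1/f)."""
    return signal_level - reset_level

# Toy example: the same noise term appears in both reads and cancels.
noise = 0.37           # kTC + offset + 1/f, identical in both samples
photo_signal = 1.25    # charge actually generated by light
reset_level = noise
signal_level = photo_signal + noise
assert abs(correlated_double_sample(reset_level, signal_level) - photo_signal) < 1e-9
```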
  • the structural principles of the pixel array and the image sensor of the first camera module are described below.
  • the pixel array of the first camera module includes conventional pixels (that is, pixel A in FIGS. 3 and 5 ) and AO pixels (that is, pixels in FIGS. 3 and 5 ).
  • AO pixels include AO pixels with color information (that is, the pixel area composed of B, Gb, Gr, and R in Figure 4) and pure-brightness-information AO pixels (that is, the pixel area composed of the four Ws in Figure 4).
  • The pure-brightness-information AO pixels may also have a PD focusing function. Therefore, whether or not the image acquisition device is turned on, the AO pixels continue to output at a lower frame rate (about 10 fps) to realize the corresponding functions.
  • the manner in which the first camera module acquires the brightness information of the target image will be further described in detail below.
  • The lens in the first camera module is used to gather light and focus; the lens is wrapped and fixed by the voice coil motor.
  • The upper and lower ends of the voice coil motor are connected to spring plates.
  • When energized, the motor generates an electromagnetic force, which is eventually balanced by the elastic force of the spring plates.
  • The position of the motor can therefore be controlled by the magnitude of the driving current, pushing the motor and the lens to the in-focus position.
  • The infrared filter can filter out unwanted light projected onto the image sensor, preventing the image sensor from producing false colors or moiré and improving its effective resolution and color reproduction.
  • The light, after passing through the infrared filter, can then be sensed by the image sensor.
  • After sensing light, the image sensor converts the optical signal into an electrical signal; after dark-current correction, the signal is amplified, converted into a digital signal by the ADC, formed into a picture, and output to the image processing system.
  • The image processing system partitions the image through the following S102 and performs an automatic exposure calculation on the brightness information of the AO pixels in each area, obtaining suitable exposure parameters for each area in the field of view for use by the second camera module.
  • the image processing device obtains a target exposure parameter according to the brightness information.
  • the target exposure parameter is an exposure parameter required to make the target image reach the target exposure level.
  • the target exposure parameter refers to a parameter used to adjust the image brightness of the target image (ie, perform exposure processing) in order to ensure a reasonable exposure of the target image (ie, avoid over-exposure or under-exposure).
  • the above exposure processing can make the target image reach exposure balance.
  • The image processing device can obtain the exposure parameters required by the target image from the brightness information through a metering algorithm.
  • The metering algorithm evaluates the amount of light incident on the sensor (i.e., the brightness information) and accordingly calculates an appropriate exposure value (Exposure Value, EV).
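As a rough sketch of how such a metering result might be turned into an EV (illustrative only; the mid-gray target of 118 and the log2 adjustment rule are common AE conventions, not details taken from this application):

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """Standard definition: EV = log2(N^2 / t)."""
    return math.log2(f_number ** 2 / shutter_s)

def ev_adjustment(mean_brightness: float, target_brightness: float = 118.0) -> float:
    """EV steps needed so the measured mean reaches the target mean.
    Positive means the scene metered brighter than the target."""
    return math.log2(mean_brightness / target_brightness)

# f/2 at 1/250 s is EV = log2(1000); a scene metering at twice the
# target brightness calls for a +1 EV correction.
assert abs(exposure_value(2.0, 1 / 250) - math.log2(1000)) < 1e-9
assert abs(ev_adjustment(236.0) - 1.0) < 1e-9
```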
  • the image processing device sets the target exposure parameter as the exposure parameter required by the second camera module for exposure processing when the second camera module performs image acquisition on the target object.
  • the exposure processing can be realized by controlling the hardware.
  • The exposure control mechanism can control the following three related devices according to the obtained exposure parameters: aperture diameter (Aperture Diameter), shutter speed (Shutter Speed), and sensor sensitivity (Sensor Sensitivity), so that the target image achieves the desired exposure.
  • the image acquisition device applicable to the image processing method is an image acquisition device including a first camera module and a second camera module.
  • the brightness information of the target image can be acquired through the first camera module.
  • the target image is an image collected for the target object.
  • the target exposure parameter can be obtained.
  • the target exposure parameter is an exposure parameter required to make the target image reach the target exposure level.
  • the target exposure parameter can be set as the exposure parameter required by the second camera module for exposure processing. Therefore, when the second camera module collects an image of the target object, the first camera module has obtained the exposure parameters required by the target image collected for the target object.
  • the second camera module can directly perform exposure processing according to the target exposure parameters obtained by the first camera module. Therefore, for the image acquisition device including the first camera module and the second camera module, the time required for exposure convergence can be effectively reduced, and the efficiency of exposure convergence can be improved.
  • Optionally, the phase focusing function can be realized through the first camera module, and the focusing information can be recorded in advance to assist quick focusing when switching to the normal image-output mode.
  • the focus information can also assist the AO function to judge motion information.
  • Optionally, in order to obtain the exposure parameters according to the brightness information, the target image may be partitioned, and the exposure parameters required by each area may then be calculated.
  • the target image includes N pixel areas, each of the N pixel areas corresponds to a brightness value, and the brightness information includes N brightness values.
  • N is an integer greater than or equal to 2.
  • the N pixel regions may be arranged in an array.
  • S101 includes the following S101a:
  • the image processing device acquires the luminance values corresponding to each pixel area through the first camera module.
  • The luminance values corresponding to the above N pixel regions may be the same as or different from one another.
  • the luminance value of each pixel area in the aforementioned N pixel areas can be collected and obtained through the AO function respectively.
  • the first camera module with the PPD structure as shown in FIG. 2 may be used to realize accurate and efficient collection of the brightness value of each pixel area.
  • S102 includes the following S102a:
  • the image processing device obtains target exposure parameters required by each pixel area according to the luminance values corresponding to each pixel area.
  • the target exposure parameters required by each pixel area can be obtained through calculation according to the brightness value of each pixel area.
  • For example, the target image can be divided into an 8×8 array of pixel areas, assuming that the brightness value of each pixel area is as shown in FIG. 5. Then, according to the brightness values respectively corresponding to the 8×8 arrayed pixel areas, the target exposure parameters required by each pixel area in the 8×8 array can be obtained.
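The 8×8 partition and per-area brightness calculation described above can be sketched as follows. This is a minimal NumPy illustration; the function name and the uniform-grid assumption are ours, not the application's.

```python
import numpy as np

def region_means(image: np.ndarray, rows: int = 8, cols: int = 8) -> np.ndarray:
    """Split a grayscale image into a rows x cols grid and return the mean
    brightness of each area, i.e. the per-area values fed to the AE step."""
    h, w = image.shape
    rh, rw = h // rows, w // cols
    trimmed = image[:rh * rows, :rw * cols]  # trim to an exact multiple
    return trimmed.reshape(rows, rh, cols, rw).mean(axis=(1, 3))

img = np.arange(64 * 64, dtype=float).reshape(64, 64)
means = region_means(img)
assert means.shape == (8, 8)
```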
  • In this way, the target exposure parameters required by each area can be calculated separately, achieving the goal of using the brightness information collected by the first camera module to enable the second camera module to perform exposure processing quickly and appropriately.
  • When S101 includes S101a and S102 includes S102a, S103 includes the following S103a:
  • S103a: according to the corresponding relationship between the image collected by the second camera module and the target image, the image processing device uses the average value of the exposure parameters required by P pixel areas among the N pixel areas as the target exposure parameter, and sets the target exposure parameter as the exposure parameter required by the second camera module for exposure processing.
  • P is an integer less than or equal to N and greater than or equal to 2.
  • the P pixel areas are areas within the field of view of the second camera module among the N pixel areas.
  • the area 100 is the field of view of the first camera module
  • the area 200 is the field of view of the second camera module.
  • the area A in FIG. 5 is the area falling within the field of view of the second camera module among the N pixel areas of the target image.
  • Example 2: in the arrayed pixel areas provided in Example 1, area A includes four pixel areas (that is, the 4×4, 4×5, 5×4, and 5×5 pixel areas in Figure 5). The average value (that is, 126.75, approximately 127) of the exposure parameters required by the four pixel areas in area A (that is, 122 for the 4×4 area, 137 for the 4×5 area, 113 for the 5×4 area, and 135 for the 5×5 area in Figure 5) can then be used to perform exposure processing on the region corresponding to area A in the image captured by the second camera module.
  • Example 3: as shown in Figure 5, in the arrayed pixel areas provided in Example 1, area B includes four pixel areas (that is, the 5×5, 5×6, 6×5, and 6×6 pixel areas in Figure 5). The average value (that is, 148) of the exposure parameters required by the four pixel areas in area B (that is, 135 for the 5×5 area, 158 for the 5×6 area, 132 for the 6×5 area, and 167 for the 6×6 area in Figure 5) can then be used to perform exposure processing on the region corresponding to area B in the image captured by the second camera module.
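The averaging in Examples 2 and 3 reduces to the following (the helper name is ours; the values are those read from Figure 5 above):

```python
def region_average(values):
    """Average the exposure parameters of the pixel areas that fall
    within the second camera module's field of view."""
    return sum(values) / len(values)

# Area A (Example 2): 4x4=122, 4x5=137, 5x4=113, 5x5=135 -> 126.75 (~127)
assert region_average([122, 137, 113, 135]) == 126.75
# Area B (Example 3): 5x5=135, 5x6=158, 6x5=132, 6x6=167 -> 148
assert region_average([135, 158, 132, 167]) == 148.0
```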
  • the purpose of quickly and appropriately exposing the images captured by the second camera module can also be achieved.
  • Optionally, the corresponding relationship between the image collected by the second camera module and the target image can be predicted in advance.
  • the method further includes:
  • the image processing device predicts the corresponding relationship between the image captured by the second camera module and the target image according to the movement trend of the field of view of the second camera module.
  • The purpose of predicting the corresponding relationship between the image captured by the second camera module and the target image is to obtain in advance the exposure parameters required for the image acquired within the field of view after the field of view of the second camera module moves.
  • the movement trend of the field of view of the second camera module may include the movement direction, distance and speed of the field of view of the second camera module.
  • The moving direction of the field of view of the second camera module can be judged by a gyroscope, and the moving distance or speed of the field of view can be judged by a ranging sensor or by monitoring the VCM.
  • The movement trend of the field of view of the second camera module can be obtained by measurement after the field of view moves; it can also be predicted or pre-judged before the field of view moves; it can also be judged during the movement, in which case the corresponding relationship between the image collected by the second camera module and the target image is predicted synchronously, or the exposure-parameter calculation, acquisition, and exposure processing are performed synchronously.
  • the field of view of the first camera module covers and is larger than the field of view of the second camera module.
  • the field of view of the first camera module at least partially overlaps with the field of view of the second camera module.
  • the first camera module may adopt a wide-angle lens
  • the second camera module may adopt a conventional lens.
  • FOV Field of View
  • the field of view of the conventional lens is in the central area of the field of view of the wide-angle lens.
  • the area 100 is the field of view of the wide-angle lens
  • Area 200 is the field of view of the conventional lens. When the field of view of the conventional lens moves from the center to the edge, the exposure parameters required for exposure can be configured for the conventional lens in advance.
  • The pixel-area division of the image acquired by the wide-angle lens in FIG. 6 is consistent with that in FIG. 5.
  • the trend of the field of view of the conventional lens moving from the center to the edge is the trend of moving from area A to area B in FIG. 5 .
  • Suppose the average value of the exposure parameters in area A is approximately 127, and the average value of the exposure parameters in area B is 148. Then, when the gyroscope determines that the field of view of the conventional lens has moved from area A to area B, exposure can be completed by directly applying the average exposure parameter of area B; the exposure parameters can even be updated synchronously during the movement.
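A much-simplified sketch of this prediction step follows. The window representation and function name are hypothetical; they only illustrate shifting the covered 2×2 block of areas along the gyro-reported direction and clamping at the grid edge.

```python
def predicted_window(current, direction, grid=8):
    """Shift a set of (row, col) area indices one step in the reported
    movement direction, clamped to the grid of pixel areas."""
    dr, dc = {"up": (-1, 0), "down": (1, 0),
              "left": (0, -1), "right": (0, 1)}[direction]
    clamp = lambda v: min(max(v, 0), grid - 1)
    return [(clamp(r + dr), clamp(c + dc)) for r, c in current]

# Moving toward the lower-right edge turns area A into area B of Figure 5,
# so area B's precomputed average exposure parameter can be applied directly.
area_a = [(4, 4), (4, 5), (5, 4), (5, 5)]
area_b = predicted_window(predicted_window(area_a, "down"), "right")
assert area_b == [(5, 5), (5, 6), (6, 5), (6, 6)]
```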
  • the exposure parameters required for capturing images within the field of view of the second camera module after the field of view moves can be obtained in advance, thereby further improving exposure efficiency.
  • the image processing method provided in the embodiment of the present application may be executed by an image processing device, or a control module in the image processing device for executing the image processing method.
  • Here, an image processing device executing the image processing method is taken as an example to describe the image processing device provided in the embodiment of the present application.
  • an embodiment of the present application provides an image processing apparatus 200 , which is applied to an image acquisition device, and the image acquisition device includes a first camera module and a second camera module.
  • the image processing device 200 includes:
  • the acquiring module 210 is configured to acquire brightness information of the target image through the first camera module, wherein the target image is an image collected for the target object.
  • the processing module 220 is configured to obtain a target exposure parameter according to the brightness information acquired by the acquiring module 210, wherein the target exposure parameter is an exposure parameter required to make the target image reach a target exposure level.
  • the exposure module 230 is configured to set the target exposure parameter obtained by the processing module 220 as the exposure parameter required by the second camera module for exposure processing when the second camera module performs image acquisition on the target object.
  • the image acquisition device suitable for the image processing apparatus 200 is an image acquisition device including a first camera module and a second camera module.
  • the brightness information of the target image can be acquired through the first camera module.
  • the target image is an image collected for the target object.
  • the image processing device 200 can obtain the target exposure parameter according to the brightness information.
  • the target exposure parameter is an exposure parameter required to make the target image reach the target exposure level.
  • the target exposure parameter can be set as the exposure parameter required by the second camera module for exposure processing. Therefore, when the second camera module collects an image of the target object, the first camera module has obtained the target exposure parameters required by the target image collected for the target object.
  • the second camera module can directly perform exposure processing according to the target exposure parameters obtained by the first camera module. Therefore, for an image acquisition device including the first camera module and the second camera module, the image processing device 200 can effectively reduce the time required for exposure convergence and improve the efficiency of exposure convergence.
  • the target image includes N pixel areas, each of the N pixel areas corresponds to a brightness value, and the brightness information includes N brightness values.
  • the obtaining module 210 is specifically used for:
  • the brightness value corresponding to each pixel area is obtained through the first camera module.
  • the processing module 220 is specifically used for:
  • the exposure parameters required by each pixel area are obtained.
  • N is an integer greater than or equal to 2.
  • the exposure module 230 is specifically used for:
  • the average value of the exposure parameters required by the P pixel areas in the N pixel areas is used as the target exposure parameter according to the corresponding relationship between the image collected by the second camera module and the target image, and the target exposure parameter is set as the exposure parameter required by the second camera module for exposure processing;
  • P is an integer less than or equal to N and greater than or equal to 2
  • the P pixel areas are areas within the field of view of the second camera module among the N pixel areas.
  • when the field of view of the second camera module moves, the device further includes:
  • the prediction module 240 is configured to predict the correspondence between the image collected by the second camera module and the target image according to the movement trend of the field of view of the second camera module, before the average value of the exposure parameters required by the P pixel areas in the N pixel areas is used as the target exposure parameter according to that correspondence and the target exposure parameter is set as the exposure parameter required by the second camera module for exposure processing.
  • the field of view of the first camera module covers and is larger than the field of view of the second camera module.
  • the field of view of the first camera module at least partially overlaps with the field of view of the second camera module.
  • the image processing apparatus in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal.
  • the device may be a mobile electronic device or a non-mobile electronic device.
  • the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, or a personal digital assistant (personal digital assistant, PDA).
  • the non-mobile electronic device may be a server, network attached storage (Network Attached Storage, NAS), a personal computer (personal computer, PC), a television (television, TV), a teller machine, a self-service machine, or the like, which is not specifically limited in the embodiments of the present application.
  • the image processing device provided in the embodiment of the present application can implement each process implemented in the method embodiment of FIG. 1; details are not repeated here to avoid repetition.
  • the embodiment of the present application further provides an electronic device 800, including a processor 801, a memory 802, and a program or instruction stored in the memory 802 and executable on the processor 801. When the program or instruction is executed by the processor 801, each process of the above image processing method embodiment can be realized with the same technical effect; to avoid repetition, details are not repeated here.
  • the electronic devices in the embodiments of the present application include the above-mentioned mobile electronic devices and non-mobile electronic devices.
  • FIG. 9 is a schematic diagram of a hardware structure of an electronic device 900 implementing an embodiment of the present application.
  • the electronic device 900 includes, but is not limited to, components such as a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, and a processor 910.
  • the electronic device 900 can also include a power supply (such as a battery) for supplying power to the various components; the power supply can be logically connected to the processor 910 through a power management system, so that functions such as charge management, discharge management, and power consumption management can be realized through the power management system.
  • the structure of the electronic device shown in FIG. 9 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown in the figure, combine some components, or arrange components differently, which will not be repeated here.
  • the input unit 904 may include a graphics processing unit (Graphics Processing Unit, GPU) 9041 and a microphone 9042; the graphics processor 9041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 907 includes a touch panel 9071 and other input devices 9072.
  • the touch panel 9071 is also called a touch screen.
  • the touch panel 9071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 9072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which will not be repeated here.
  • the memory 909 can be used to store software programs as well as various data, including but not limited to application programs and operating systems.
  • the processor 910 may integrate an application processor and a modem processor, wherein the application processor mainly processes operating systems, user interfaces, and application programs, and the modem processor mainly processes wireless communications. It can be understood that the foregoing modem processor may not be integrated into the processor 910 .
  • the embodiment of the present application also provides a readable storage medium, the readable storage medium stores a program or an instruction, and when the program or instruction is executed by a processor, each process of the above-mentioned image processing method embodiment is realized, and can achieve the same To avoid repetition, the technical effects will not be repeated here.
  • the processor is the processor in the electronic device described in the above embodiments.
  • the readable storage medium includes computer readable storage medium, such as computer read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk or optical disk, etc.
  • the embodiment of the present application further provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is used to run programs or instructions to implement the various processes of the above method embodiments , and can achieve the same technical effect, in order to avoid repetition, it will not be repeated here.
  • chips mentioned in the embodiments of the present application may also be called system-on-chip, system-on-chip, system-on-a-chip, or system-on-a-chip.
  • the terms "comprising", "including", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus comprising a set of elements includes not only those elements but also other elements not expressly listed, or elements inherent in the process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a..." does not preclude the presence of additional identical elements in the process, method, article, or apparatus comprising that element.
  • the scope of the methods and devices in the embodiments of the present application is not limited to performing functions in the order shown or discussed; functions may also be performed in a substantially simultaneous manner or in the reverse order, depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
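The exposure handoff this section describes (the first module meters, the second module applies the resulting parameter directly) can be sketched as follows. This is a minimal illustration; all names (`compute_target_exposure`, `CameraModule`) and the proportional exposure model are assumptions, not the patent's actual implementation.

```python
# Minimal sketch of the two-module exposure handoff described above.
# All names are hypothetical; real camera stacks expose this via vendor HALs.

def compute_target_exposure(brightness: float, target_level: float = 0.5) -> float:
    """Exposure scale factor that would bring the metered brightness
    to the target exposure level (simple proportional model)."""
    return target_level / max(brightness, 1e-6)

class CameraModule:
    def __init__(self, name: str):
        self.name = name
        self.exposure = 1.0  # normalized exposure parameter

first = CameraModule("first (always-on, metering)")
second = CameraModule("second (main imaging)")

# S101: the first module meters the target image's brightness (normalized 0..1).
metered_brightness = 0.25
# S102: derive the target exposure parameter from the brightness information.
target_exposure = compute_target_exposure(metered_brightness)
# S103: when the second module starts capturing, apply the parameter directly,
# skipping the usual multi-frame exposure-convergence loop.
second.exposure = target_exposure
print(second.exposure)  # 2.0
```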

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

This application discloses an image processing method and apparatus, and an electronic device. The image processing method is applied to an image acquisition device that includes a first camera module and a second camera module. The method includes: acquiring brightness information of a target image through the first camera module, where the target image is an image collected for a target object; obtaining a target exposure parameter according to the brightness information, where the target exposure parameter is the exposure parameter required to bring the target image to a target exposure level; and, when the second camera module performs image acquisition on the target object, setting the target exposure parameter as the exposure parameter required by the second camera module for exposure processing.

Description

Image processing method and apparatus, and electronic device
Cross-reference to related applications
This application claims priority to Chinese Patent Application No. 202110704355.9, filed in China on June 24, 2021, the entire contents of which are incorporated herein by reference.
Technical field
This application belongs to the technical field of image processing, and specifically relates to an image processing method and apparatus, and an electronic device.
Background
With the development of technology, image acquisition devices, and electronic devices equipped with them, have become widely popular among users.
For an image acquisition device, the automatic exposure (Automatic Exposure, AE) function processes the images the device captures so that they are neither too bright nor too dark.
To implement this automatic exposure function, every frame must be evaluated continuously to determine whether it has reached a reasonable exposure target. The exposure convergence process therefore typically repeats over 4 to 5 frames. Taking a frame rate (Frames Per Second, FPS) of 30 as an example, the time required for exposure convergence is about 150 ms.
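The frame-count arithmetic above can be made concrete with a one-line helper (illustrative only; the helper name is not from the patent):

```python
# Exposure convergence cost: N frames at a given frame rate, in milliseconds.
def convergence_ms(frames: float, fps: float) -> float:
    return frames / fps * 1000.0

# 4 to 5 frames at 30 FPS brackets the ~150 ms figure quoted above.
print(convergence_ms(4, 30))    # ~133.3 ms
print(convergence_ms(5, 30))    # ~166.7 ms
print(convergence_ms(4.5, 30))  # 150.0 ms
```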
From the above analysis, reducing the time required for exposure convergence and improving its efficiency is a technical problem that urgently needs to be solved by those skilled in the art.
Summary
The purpose of the embodiments of this application is to provide an image processing method and apparatus, and an electronic device, which can solve the problem that exposure convergence takes a long time.
In a first aspect, an embodiment of this application provides an image processing method applied to an image acquisition device, the image acquisition device including a first camera module and a second camera module. The method includes: acquiring brightness information of a target image through the first camera module, where the target image is an image collected for a target object; obtaining a target exposure parameter according to the brightness information, where the target exposure parameter is the exposure parameter required to bring the target image to a target exposure level; and, when the second camera module performs image acquisition on the target object, setting the target exposure parameter as the exposure parameter required by the second camera module for exposure processing.
In a second aspect, an embodiment of this application provides an image processing apparatus applied to an image acquisition device, the image acquisition device including a first camera module and a second camera module. The apparatus includes: an acquiring module, configured to acquire brightness information of a target image through the first camera module, where the target image is an image collected for a target object; a processing module, configured to obtain a target exposure parameter according to the brightness information acquired by the acquiring module, where the target exposure parameter is the exposure parameter required to bring the target image to a target exposure level; and an exposure module, configured to, when the second camera module performs image acquisition on the target object, set the target exposure parameter obtained by the processing module as the exposure parameter required by the second camera module for exposure processing.
In a third aspect, an embodiment of this application provides an electronic device, including a processor, a memory, and a program or instruction stored in the memory and executable on the processor, where the program or instruction, when executed by the processor, implements the steps of the method described in the first aspect.
In a fourth aspect, an embodiment of this application provides a readable storage medium storing a program or instruction that, when executed by a processor, implements the steps of the method described in the first aspect.
In a fifth aspect, an embodiment of this application provides a chip, the chip including a processor and a communication interface, the communication interface being coupled to the processor, and the processor being configured to run a program or instruction to implement the method described in the first aspect.
In the embodiments of this application, the image acquisition device to which the image processing method applies includes a first camera module and a second camera module. Brightness information of a target image, collected for a target object, can be acquired through the first camera module. A target exposure parameter, namely the exposure parameter required to bring the target image to a target exposure level, can then be obtained from the brightness information. When the second camera module performs image acquisition on the same target object, the target exposure parameter can be set as the exposure parameter required by the second camera module for exposure processing. Thus, by the time the second camera module starts capturing, the first camera module has already determined the required exposure parameter, and the second camera module can perform exposure processing with it directly. For an image acquisition device that includes both camera modules, this effectively reduces the time required for exposure convergence and improves its efficiency.
Brief description of the drawings
FIG. 1 is a flowchart of the steps of an image processing method according to an embodiment of this application;
FIG. 2 is a pixel schematic diagram of an image processing apparatus according to an embodiment of this application;
FIG. 3 is a photoelectric conversion schematic diagram of an image processing apparatus according to an embodiment of this application;
FIG. 4 is a pixel array diagram of an image processing apparatus according to an embodiment of this application;
FIG. 5 is a pixel partition diagram of a target image collected by the first camera module according to an embodiment of this application;
FIG. 6 shows the fields of view of the first camera module and the second camera module according to an embodiment of this application;
FIG. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of this application;
FIG. 8 is a first schematic structural diagram of an electronic device according to an embodiment of this application;
FIG. 9 is a second schematic structural diagram of an electronic device according to an embodiment of this application.
Detailed description
The technical solutions in the embodiments of this application will be described clearly below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of this application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of this application fall within the protection scope of this application.
The terms "first", "second", and the like in the specification and claims of this application are used to distinguish similar objects, not to describe a specific order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of this application can be implemented in orders other than those illustrated or described here. Objects distinguished by "first", "second", and the like are usually of one class, and the number of objects is not limited; for example, there may be one first object or multiple. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The image processing method and apparatus and the electronic device provided by the embodiments of this application are described in detail below through specific embodiments and application scenarios, with reference to the accompanying drawings.
The execution subject of the image processing method provided by the embodiments of this application may be an image processing apparatus, which may be an electronic device or a functional module and/or functional entity in an electronic device; this can be determined according to actual usage requirements and is not limited by the embodiments of this application. To describe the image processing method more clearly, the following method embodiments take an image processing apparatus as the execution subject by way of example.
The method provided by the embodiments of this application is described in detail below through various embodiments.
As shown in FIG. 1, an embodiment of this application provides an image processing method whose purpose is to increase the speed and efficiency of exposure processing of images and thereby improve the user experience.
The image processing method provided by the embodiments of this application is applied to an image acquisition device, which may be a device used for image acquisition, or an electronic device equipped with a module or component for image acquisition.
For example, the image acquisition device to which the image processing method provided by the embodiments of this application applies may be a single-lens reflex camera or a single-chip (microcontroller-based) camera, or a smartphone, a personal computer, or a wearable smart device.
The image acquisition device to which the image processing method provided by the embodiments of this application applies may include a first camera module and a second camera module. The first camera module and the second camera module cooperate with each other, and each can implement an image acquisition function. The two modules may be of the same or different type, structure, and size.
For example, at least one of the first camera module and the second camera module may be a camera module that performs imaging based on a charge-coupled device (Charge Coupled Device, CCD), or one that performs imaging based on a complementary metal-oxide-semiconductor (Complementary Metal Oxide Semiconductor, CMOS) sensor.
The structure of the first and second camera modules is illustrated below taking a CMOS camera module as an example.
The CMOS camera module is a device widely used in the related art. It mainly consists of a lens (Lens), a voice coil motor (Voice Coil Motor, VCM), an infrared filter (IR Filter), a CMOS image sensor, a digital signal processor (Digital Signal Processor, DSP), and a flexible printed circuit (Flexible Printed Circuit, FPC).
The CMOS image sensor is a semiconductor made mainly of the elements silicon and germanium, on which N-type (negatively charged) and P-type (positively charged) semiconductors coexist; the current produced by their complementary effect can be recorded and interpreted by the processing chip.
The surface of the CMOS image sensor carries hundreds of thousands to millions of photodiodes, each covered by a micro-lens (Micro-lens) and a color filter array (Color Filter Array). The micro-lens guides light into the photodiode, while the color filter array filters the light, allowing only the waveband corresponding to the filter's color to pass. When illuminated, the photodiode generates charge, converting light into an electrical signal, and an image is thereby formed.
The basic workflow of the CMOS camera module is as follows: the voice coil motor drives the lens to the position of accurate focus; external light passes through the lens, is filtered by the infrared filter, and strikes the photodiodes (Pixel); the photodiodes convert the sensed optical signal into an electrical signal, which passes through an amplification circuit and an AD conversion circuit to form a matrix of digital signals (i.e., an image), which is then processed by the digital signal processor and stored in compressed form.
The image processing method provided by the embodiments of this application includes the following S101 to S103:
S101: The image processing apparatus acquires brightness information of a target image through the first camera module.
In the embodiments of this application, the first camera module may be a camera module with an always-on (Always On, AO) function. The AO function allows the chip to output image data at low resolution and low frame rate over long periods. Combined with the AO function, the image acquisition device can support AI-related applications such as smart payment, presence sensing, face detection, gesture recognition, smart screen wake-up, and QR code scanning. The AO function can sense the current environment at any time, evolving the image acquisition device from passive intelligence to active intelligence: it can perceive its environment autonomously and interact proactively with the user.
Optionally, in the embodiments of this application, the first camera module may collect the brightness information of the target image through the AO function.
Further, the first camera module may be a camera module that has the AO function and can implement auxiliary autofocus, automatic exposure, and color-assist functions.
In the embodiments of this application, the target image is an image collected for a target object.
It can be understood that the target object may be a specific target in the surrounding space, such as a person, an object, or a building, or a non-specific target, such as scenery.
It can be understood that, since the first camera module has the AO function, it can acquire the brightness information of the target image anytime and anywhere, regardless of whether the image acquisition device is in use.
To better collect and acquire brightness information, the first camera module in the embodiments of this application may adopt the following pinned photodiode pixel (Pinned Photodiode Pixel, PPD) structure.
As shown in FIG. 2, the PPD structure of the first camera module includes a PPD photosensitive region (i.e., the photodiode), a reset switch RST, a control switch TX, a row selector SET, and a source follower SF. The PPD structure allows the introduction of a correlated double sampling (CDS) circuit, which eliminates the kTC noise introduced by reset as well as the 1/f noise and offset noise introduced by the MOS transistors. In FIG. 2, FD1, FD2, and FD3 are three capacitors, DCG1 and DCG2 are the control switches of the three capacitors, and VDD is the supply voltage.
After the photodiode has been exposed, the above pixel circuit can read out the photoelectric signal at different gain levels by controlling the two DCG switches, thereby increasing the dynamic range and allowing the sensitivity to vary flexibly.
For example, if the capacitance ratio of FD1, FD2, and FD3 is 1:3:4, then the total capacitance with both DCG1 and DCG2 open is 1; with DCG2 open and DCG1 closed it is 4; and with both DCG1 and DCG2 closed it is 8. The sensitivity and dynamic range of the pixel circuit can thus vary over an 8x range.
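Under the stated 1:3:4 ratio, the three switch states give total capacitances of 1, 4, and 8 units; since conversion gain is inversely proportional to the floating-diffusion capacitance, this yields the 8x span. A sketch of the arithmetic (arbitrary units; the switching topology is a simplified reading of the text, not a verified circuit model):

```python
# Total floating-diffusion capacitance for each dual-conversion-gain state,
# using the 1:3:4 ratio of FD1:FD2:FD3 from the text (arbitrary units).
FD1, FD2, FD3 = 1, 3, 4

def total_capacitance(dcg1_on: bool, dcg2_on: bool) -> int:
    c = FD1          # FD1 is always connected
    if dcg1_on:
        c += FD2     # DCG1 switches in FD2
    if dcg2_on:
        c += FD3     # DCG2 switches in FD3
    return c

low  = total_capacitance(False, False)  # 1
mid  = total_capacitance(True,  False)  # 4
high = total_capacitance(True,  True)   # 8
# Conversion gain ~ 1/C, so sensitivity spans an 8x range.
print(high / low)  # 8.0
```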
The PPD structure of the first camera module works as follows:
1. Exposure: electron-hole pairs generated by illumination are separated by the PPD's electric field; electrons move to the n region and holes to the p region.
2. Reset: at the end of exposure, RST can be activated to reset the readout region (the n+ region) to a high level.
3. Reset-level readout: after reset is complete, the reset level is read out. It contains the MOS transistors' offset noise and 1/f noise and the kTC noise introduced by reset; the read signal is stored in the first capacitor.
4. Charge transfer: TX is activated to transfer the charge completely from the photosensitive region to the n+ region for readout. The mechanism here can be understood as similar to charge transfer in a CCD.
5. Signal-level readout: the voltage signal of the n+ region is read out to the second capacitor. This signal includes the signal produced by photoelectric conversion, the offset produced by the operational amplifier, the 1/f noise, and the kTC noise introduced by reset.
6. Signal output: the signals stored in the two capacitors are subtracted (i.e., CDS, which removes the main noise in the pixel); the resulting signal is analog-amplified, then sampled by the ADC, and can be output as a digital signal.
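Step 6's correlated double sampling is simply the difference of the two stored samples: because the reset-level sample and the signal-level sample share the same kTC, offset, and 1/f terms, subtracting them cancels those terms and leaves the photo-signal. A minimal numeric sketch (arbitrary units, hypothetical values):

```python
# Correlated double sampling (CDS): subtract the reset-level sample from the
# signal-level sample so the common noise/offset terms cancel.
def cds(reset_sample: float, signal_sample: float) -> float:
    return signal_sample - reset_sample

photo_signal = 0.75          # what we actually want to measure
common_noise = 0.25          # kTC + offset + 1/f terms shared by both samples

reset_level  = common_noise                 # sample 1: noise/offset only
signal_level = photo_signal + common_noise  # sample 2: photo signal + same noise

print(cds(reset_level, signal_level))  # 0.75 -- the noise terms cancel
```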
To explain how the first camera module acquires the brightness information of the target image anytime and anywhere through the AO function, the structure of the first camera module's pixel array and image sensor is described below.
For example, as shown in FIG. 3 to FIG. 5, the pixel array of the first camera module includes regular pixels (pixel A in FIG. 3 and FIG. 5) and AO pixels (pixel B in FIG. 3 and FIG. 5). The AO pixels in turn include AO pixels with color information (the pixel region formed by B, Gb, Gr, and R in FIG. 4) and brightness-only AO pixels (the pixel region formed by the four W pixels in FIG. 4). In FIG. 4, there can be 8 AO pixels in a 16x16 area. In addition to acquiring brightness information, the brightness-only AO pixels can also provide PD focusing. Thus, whether or not the image acquisition device is switched on, the AO pixels continuously output at a low frame rate (around 10 fps) to implement the corresponding functions.
The way the first camera module acquires the brightness information of the target image is further detailed below.
The lens in the first camera module provides light gathering and focusing, and is wrapped and held by the voice coil motor, whose upper and lower ends are linked to spring plates. During focusing, a current is applied so that the motor produces an electromagnetic force that ultimately balances the elastic force of the spring plates. The motor's position can therefore be controlled by the amount of current, pushing the motor and lens to the in-focus position.
After external light is projected toward the infrared filter in the first camera module, the filter removes unwanted light headed for the image sensor, preventing false colors or moiré on the image sensor and improving its effective resolution and color reproduction. The light that passes through can then be sensed by the image sensor.
After sensing the light, the image sensor converts the optical signal into an electrical signal; after dark-current correction, the signal is amplified and converted by the ADC into a digital signal, forming a raw image that is output to the image processing system.
The image processing system then partitions the image through S102 below, runs the automatic exposure algorithm on the brightness information of the AO pixels in each region, and obtains the exposure parameter suitable for each region of the field of view, for use by the second camera module.
S102: The image processing apparatus obtains a target exposure parameter according to the brightness information.
In the embodiments of this application, the target exposure parameter is the exposure parameter required to bring the target image to a target exposure level.
Specifically, the target exposure parameter is a parameter used to adjust the imaging brightness of the target image (i.e., to perform exposure processing) so as to ensure that the target image's exposure level is reasonable (i.e., neither overexposed nor underexposed).
The above exposure processing can bring the target image to exposure balance. For example, the image processing apparatus may obtain the exposure parameter required by the target image from the brightness information through a metering algorithm. The metering algorithm evaluates the amount of light incident on the sensor (i.e., the brightness information) and computes an appropriate exposure value (Exposure Value, EV) accordingly.
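The text does not specify the metering algorithm; one common approach is a log2 ratio of target to measured mean luminance, sketched below as a hedged illustration (the 18% grey target is a conventional default, not a value from the patent):

```python
import math

def ev_adjustment(measured_luma: float, target_luma: float = 0.18) -> float:
    """EV correction (in stops) that would bring the measured mean
    luminance to the target (18% grey is a common default)."""
    return math.log2(target_luma / max(measured_luma, 1e-6))

# A scene metering at 9% grey needs +1 EV; one at 36% needs -1 EV.
print(ev_adjustment(0.09))  # 1.0
print(ev_adjustment(0.36))  # -1.0
```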
S103: When the second camera module performs image acquisition on the target object, the image processing apparatus sets the target exposure parameter as the exposure parameter required by the second camera module for exposure processing.
In the embodiments of this application, exposure processing can be implemented through hardware control. For example, the exposure control mechanism can control the following three related quantities according to the obtained exposure parameter: aperture diameter (Aperture Diameter), shutter speed (Shutter Speed), and sensor sensitivity (Sensor Sensitivity), thereby achieving the exposure the target image requires.
In the embodiments of this application, the image acquisition device to which this image processing method applies includes a first camera module and a second camera module. Brightness information of a target image, collected for a target object, can be acquired through the first camera module, and from it a target exposure parameter can be obtained, namely the exposure parameter required to bring the target image to a target exposure level. When the second camera module performs image acquisition on the target object, the target exposure parameter can be set as the exposure parameter required by the second camera module for exposure processing. Thus, by the time the second camera module captures the target object, the first camera module has already obtained the required exposure parameter, and the second camera module can perform exposure processing with it directly. For an image acquisition device including both camera modules, this effectively reduces the time required for exposure convergence and improves its efficiency.
In addition, in the embodiments of this application, color and brightness information can be collected and acquired over a larger dynamic range, enabling automatic white balance (Automatic White Balance, AWB) and fast automatic exposure when switching to the regular imaging mode; greater sensitivity and dynamic range are also available while the AO function runs, which greatly helps the AO function's motion detection and response speed.
Finally, in the embodiments of this application, the first camera module can implement phase-detection focusing and record focus information in advance, which assists fast focusing when switching to the regular imaging mode. The focus information can also assist the AO function in judging motion.
Optionally, in the embodiments of this application, to obtain the exposure parameter from the brightness information, the target image can be partitioned, and the exposure parameter required by each region can then be computed.
For example, the target image includes N pixel regions, each of which corresponds to a brightness value, and the brightness information includes the N brightness values.
Here, N is an integer greater than or equal to 2.
Optionally, in the embodiments of this application, the N pixel regions may be arranged in an array.
Based on this partitioning of the target image, S101 includes the following S101a:
S101a: The image processing apparatus acquires, through the first camera module, the brightness value corresponding to each pixel region.
It can be understood that the brightness values corresponding to the N pixel regions may be the same or different.
It can be understood that the brightness value of each of the N pixel regions may be collected through the AO function.
Optionally, in the embodiments of this application, a first camera module with the PPD structure shown in FIG. 2 can be used to collect the brightness value of each pixel region accurately and efficiently.
Based on this partitioning of the target image, S102 includes the following S102a:
S102a: The image processing apparatus obtains, from the brightness value corresponding to each pixel region, the target exposure parameter required by each pixel region.
It can be understood that the target exposure parameter required by each pixel region can be computed from that region's brightness value.
Example 1: the target image can be divided into an 8x8 array of pixel regions, with the brightness value of each region as shown in FIG. 5. The target exposure parameter required by each of the 8x8 regions can then be obtained from the corresponding brightness values.
In this way, by partitioning the target image, the target exposure parameter required by each region can be computed separately, so that the second camera module can perform fast and suitable exposure processing based on the brightness information collected by the first camera module.
Optionally, in the embodiments of this application, when S101 includes S101a and S102 includes S102a, S103 includes the following S103a:
S103a: When the second camera module performs image acquisition on the target object, the image processing apparatus uses, according to the correspondence between the image collected by the second camera module and the target image, the average of the exposure parameters required by P pixel regions among the N pixel regions as the target exposure parameter, and sets the target exposure parameter as the exposure parameter required by the second camera module for exposure processing.
Here, P is an integer less than or equal to N and greater than or equal to 2.
In the embodiments of this application, the P pixel regions are the regions among the N pixel regions that fall within the field of view of the second camera module.
For example, as shown in FIG. 6, region 100 is the field of view of the first camera module and region 200 is the field of view of the second camera module. Correspondingly, region A in FIG. 5 (i.e., the regions of the target image's N pixel regions that fall within the second camera module's field of view) corresponds to the P pixel regions above.
Example 2: as shown in FIG. 5, in the array of pixel regions of Example 1, region A contains four pixel regions (the 4x4, 4x5, 5x4, and 5x5 regions in FIG. 5). The average (126.75, approximately 127) of the exposure parameters required by these four regions (122 for the 4x4 region, 137 for 4x5, 113 for 5x4, and 135 for 5x5 in FIG. 5) can then be used to perform exposure processing on the part of the image collected by the second camera module that corresponds to region A.
Example 3: as shown in FIG. 5, in the array of pixel regions of Example 1, region B contains four pixel regions (the 5x5, 5x6, 6x5, and 6x6 regions in FIG. 5). The average (148) of the exposure parameters required by these four regions (135 for the 5x5 region, 158 for 5x6, 132 for 6x5, and 167 for 6x6 in FIG. 5) can then be used to perform exposure processing on the part of the image collected by the second camera module that corresponds to region B.
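The averaging in Examples 2 and 3 can be reproduced directly (the values are those quoted from FIG. 5 in the text):

```python
# Average the per-region exposure parameters of the P regions that fall
# inside the second camera module's field of view (Examples 2 and 3 above).
def region_average(values):
    return sum(values) / len(values)

region_a = [122, 137, 113, 135]  # 4x4, 4x5, 5x4, 5x5 regions
region_b = [135, 158, 132, 167]  # 5x5, 5x6, 6x5, 6x6 regions

print(region_average(region_a))  # 126.75 (~127)
print(region_average(region_b))  # 148.0
```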
In this way, even when the second camera module needs to acquire images after moving the direction or range of its field of view, fast and suitable exposure processing of the images it collects can still be achieved.
Optionally, in the embodiments of this application, to further improve the efficiency of automatic exposure, the correspondence between the image collected by the second camera module and the target image can be predicted in advance, before exposure processing is performed on the image collected by the second camera module.
For example, when the field of view of the second camera module moves, before S103a the method further includes:
S104: The image processing apparatus predicts the correspondence between the image collected by the second camera module and the target image according to the movement trend of the second camera module's field of view.
In the embodiments of this application, the purpose of predicting this correspondence is to obtain in advance the exposure parameter needed for the image captured within the second camera module's field of view after that field of view has moved.
It can be understood that the movement trend of the second camera module's field of view may include its direction, distance, and speed of movement.
For example, the direction of movement can be determined with a gyroscope, and the distance or speed of movement can be determined through sensor measurements or by monitoring the VCM.
In the embodiments of this application, the movement trend of the second camera module's field of view may be measured after the field of view has moved; it may be predicted before the field of view moves; or it may be determined while the field of view is moving, with the correspondence between the second camera module's image and the target image predicted synchronously, or the exposure parameter computed and applied synchronously.
Optionally, in the embodiments of this application, the field of view of the first camera module covers, and is larger than, the field of view of the second camera module.
Optionally, in the embodiments of this application, the field of view of the first camera module at least partially overlaps the field of view of the second camera module.
For example, the first camera module may use a wide-angle lens and the second camera module a regular lens. Suppose the field of view (Field of View, FOV) of the regular lens is a quarter of that of the wide-angle lens and sits in the central area of the wide-angle lens's field of view. As shown in FIG. 6, region 100 is the wide-angle lens's field of view and region 200 is the regular lens's. Then, when the regular lens's field of view moves from the center toward the edge, the exposure parameter it needs can be configured for it in advance. The pixel-region partitioning of the image captured by the wide-angle lens in FIG. 6 matches FIG. 5, so the movement of the regular lens's field of view from center to edge corresponds to moving from region A to region B in FIG. 5. The average exposure parameter of region A is about 127, and that of region B is 148. Therefore, once the gyroscope determines that the regular lens's field of view has moved from region A to region B, exposure can be completed by directly applying region B's average exposure parameter. The exposure parameter can even be updated synchronously during the movement.
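The gyroscope-driven handoff in the example above amounts to a lookup: detect which pre-metered region the narrow field of view is moving into, then apply that region's precomputed average before the move completes. A sketch (the region values come from the example; the direction-to-region mapping and function names are hypothetical):

```python
# Pre-metered average exposure parameters per region of the wide camera's
# field of view (values from the example above).
precomputed = {"A": 127, "B": 148}

# Hypothetical mapping from a gyroscope-detected movement direction to the
# region the narrow field of view is about to enter.
def predict_region(current: str, direction: str) -> str:
    transitions = {("A", "toward_edge"): "B", ("B", "toward_center"): "A"}
    return transitions.get((current, direction), current)

current_region = "A"
next_region = predict_region(current_region, "toward_edge")
# The second module's exposure is set before the move finishes, so no
# per-frame convergence is needed after the field of view shifts.
print(next_region, precomputed[next_region])  # B 148
```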
In this way, the exposure parameter needed for capturing images within the second camera module's field of view after it moves can be obtained in advance, further improving exposure efficiency.
It should be noted that the execution subject of the image processing method provided by the embodiments of this application may be an image processing apparatus, or a control module in the image processing apparatus for executing the image processing method. In the embodiments of this application, an image processing apparatus executing the image processing method is taken as an example to describe the image processing apparatus provided by the embodiments of this application.
As shown in FIG. 7, an embodiment of this application provides an image processing apparatus 200, applied to an image acquisition device that includes a first camera module and a second camera module. The image processing apparatus 200 includes:
an acquiring module 210, configured to acquire brightness information of a target image through the first camera module, where the target image is an image collected for a target object;
a processing module 220, configured to obtain a target exposure parameter according to the brightness information acquired by the acquiring module 210, where the target exposure parameter is the exposure parameter required to bring the target image to a target exposure level; and
an exposure module 230, configured to, when the second camera module performs image acquisition on the target object, set the target exposure parameter obtained by the processing module 220 as the exposure parameter required by the second camera module for exposure processing.
In the embodiments of this application, the image acquisition device to which the image processing apparatus 200 applies includes a first camera module and a second camera module. Brightness information of a target image, collected for a target object, can be acquired through the first camera module, from which the image processing apparatus 200 can obtain the target exposure parameter, namely the exposure parameter required to bring the target image to a target exposure level. When the second camera module performs image acquisition on the target object, the target exposure parameter can be set as the exposure parameter required by the second camera module for exposure processing. Thus, by the time the second camera module captures the target object, the first camera module has already obtained the required target exposure parameter, and the second camera module can perform exposure processing with it directly. For an image acquisition device including both camera modules, the image processing apparatus 200 can effectively reduce the time required for exposure convergence and improve its efficiency.
Optionally, in the embodiments of this application, the target image includes N pixel regions, each of which corresponds to a brightness value, and the brightness information includes the N brightness values.
The acquiring module 210 is specifically configured to: acquire, through the first camera module, the brightness value corresponding to each pixel region.
The processing module 220 is specifically configured to: obtain, from the brightness value corresponding to each pixel region, the exposure parameter required by each pixel region.
Here, N is an integer greater than or equal to 2.
Optionally, in the embodiments of this application, the exposure module 230 is specifically configured to: use, according to the correspondence between the image collected by the second camera module and the target image, the average of the exposure parameters required by P pixel regions among the N pixel regions as the target exposure parameter, and set the target exposure parameter as the exposure parameter required by the second camera module for exposure processing; where P is an integer less than or equal to N and greater than or equal to 2, and the P pixel regions are the regions among the N pixel regions that fall within the second camera module's field of view.
Optionally, in the embodiments of this application, when the field of view of the second camera module moves, the apparatus further includes:
a prediction module 240, configured to predict the correspondence between the image collected by the second camera module and the target image according to the movement trend of the second camera module's field of view, before the average of the exposure parameters required by the P pixel regions among the N pixel regions is used as the target exposure parameter according to that correspondence and the target exposure parameter is set as the exposure parameter required by the second camera module for exposure processing.
Optionally, in the embodiments of this application, the field of view of the first camera module covers, and is larger than, the field of view of the second camera module.
Optionally, in the embodiments of this application, the field of view of the first camera module at least partially overlaps the field of view of the second camera module.
The image processing apparatus in the embodiments of this application may be a device, or a component, integrated circuit, or chip in a terminal. The device may be a mobile or non-mobile electronic device. For example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, or a personal digital assistant (personal digital assistant, PDA); the non-mobile electronic device may be a server, network attached storage (Network Attached Storage, NAS), a personal computer (personal computer, PC), a television (television, TV), a teller machine, or a self-service machine, which is not specifically limited in the embodiments of this application.
The image processing apparatus provided by the embodiments of this application can implement each process implemented in the method embodiment of FIG. 1; to avoid repetition, details are not repeated here.
Optionally, as shown in FIG. 8, an embodiment of this application further provides an electronic device 800, including a processor 801, a memory 802, and a program or instruction stored in the memory 802 and executable on the processor 801. When executed by the processor 801, the program or instruction implements each process of the above image processing method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here.
It should be noted that the electronic devices in the embodiments of this application include the mobile and non-mobile electronic devices mentioned above.
FIG. 9 is a schematic diagram of the hardware structure of an electronic device 900 implementing an embodiment of this application.
The electronic device 900 includes, but is not limited to, components such as a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, and a processor 910.
Those skilled in the art will understand that the electronic device 900 may also include a power supply (such as a battery) for supplying power to the components; the power supply may be logically connected to the processor 910 through a power management system, so that functions such as charge management, discharge management, and power consumption management are realized through the power management system. The structure of the electronic device shown in FIG. 9 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine some components, or arrange components differently, which will not be repeated here.
It should be understood that, in the embodiments of this application, the input unit 904 may include a graphics processing unit (Graphics Processing Unit, GPU) 9041 and a microphone 9042; the graphics processor 9041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 906 may include a display panel 9061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 907 includes a touch panel 9071, also called a touch screen, and other input devices 9072. The touch panel 9071 may include two parts: a touch detection device and a touch controller. The other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which will not be repeated here. The memory 909 can store software programs and various data, including but not limited to application programs and an operating system. The processor 910 may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 910.
An embodiment of this application further provides a readable storage medium on which a program or instruction is stored. When executed by a processor, the program or instruction implements each process of the above image processing method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiments. The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
An embodiment of this application further provides a chip, the chip including a processor and a communication interface, the communication interface being coupled to the processor, and the processor being configured to run a program or instruction to implement each process of the above method embodiment with the same technical effect; to avoid repetition, details are not repeated here.
It should be understood that the chip mentioned in the embodiments of this application may also be called a system-level chip, system chip, chip system, or system-on-chip.
It should be noted that, herein, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus that includes a set of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a..." does not preclude the existence of additional identical elements in the process, method, article, or apparatus that includes that element. Furthermore, it should be pointed out that the scope of the methods and apparatuses in the embodiments of this application is not limited to performing functions in the order shown or discussed; functions may also be performed in a substantially simultaneous manner or in the reverse order, depending on the functions involved. For example, the described methods may be performed in an order different from that described, and steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
From the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, though in many cases the former is the better implementation. Based on this understanding, the technical solution of this application, in essence or in the part contributing to the prior art, can be embodied in the form of a computer software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc), including several instructions to cause a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of this application.
The embodiments of this application have been described above with reference to the accompanying drawings, but this application is not limited to the specific implementations described above, which are merely illustrative rather than restrictive. Inspired by this application, those of ordinary skill in the art can devise many other forms without departing from the spirit of this application and the scope protected by the claims, all of which fall within the protection of this application.

Claims (15)

  1. An image processing method, applied to an image acquisition device, the image acquisition device comprising a first camera module and a second camera module, the method comprising:
    acquiring brightness information of a target image through the first camera module, wherein the target image is an image collected for a target object;
    obtaining a target exposure parameter according to the brightness information, wherein the target exposure parameter is the exposure parameter required to bring the target image to a target exposure level; and
    when the second camera module performs image acquisition on the target object, setting the target exposure parameter as the exposure parameter required by the second camera module for exposure processing.
  2. The image processing method according to claim 1, wherein the target image comprises N pixel regions, each of the N pixel regions corresponds to a brightness value, and the brightness information comprises N brightness values;
    the acquiring brightness information of a target image through the first camera module comprises:
    acquiring, through the first camera module, the brightness value corresponding to each pixel region;
    the obtaining a target exposure parameter according to the brightness information comprises:
    obtaining, from the brightness value corresponding to each pixel region, the target exposure parameter required by each pixel region;
    wherein N is an integer greater than or equal to 2.
  3. The image processing method according to claim 2, wherein the setting the target exposure parameter as the exposure parameter required by the second camera module for exposure processing comprises:
    using, according to the correspondence between the image collected by the second camera module and the target image, the average of the exposure parameters required by P pixel regions among the N pixel regions as the target exposure parameter, and setting the target exposure parameter as the exposure parameter required by the second camera module for exposure processing;
    wherein P is an integer less than or equal to N and greater than or equal to 2, and the P pixel regions are the regions among the N pixel regions that fall within the field of view of the second camera module.
  4. The image processing method according to claim 3, wherein, when the field of view of the second camera module moves, before the using, according to the correspondence between the image collected by the second camera module and the target image, the average of the exposure parameters required by the P pixel regions among the N pixel regions as the target exposure parameter, and setting the target exposure parameter as the exposure parameter required by the second camera module for exposure processing, the method further comprises:
    predicting the correspondence between the image collected by the second camera module and the target image according to the movement trend of the field of view of the second camera module.
  5. The image processing method according to claim 4, wherein
    the field of view of the first camera module covers and is larger than the field of view of the second camera module;
    or the field of view of the first camera module at least partially overlaps the field of view of the second camera module.
  6. An image processing apparatus, applied to an image acquisition device, the image acquisition device comprising a first camera module and a second camera module, the apparatus comprising:
    an acquiring module, configured to acquire brightness information of a target image through the first camera module, wherein the target image is an image collected for a target object;
    a processing module, configured to obtain a target exposure parameter according to the brightness information acquired by the acquiring module, wherein the target exposure parameter is the exposure parameter required to bring the target image to a target exposure level; and
    an exposure module, configured to, when the second camera module performs image acquisition on the target object, set the target exposure parameter obtained by the processing module as the exposure parameter required by the second camera module for exposure processing.
  7. The image processing apparatus according to claim 6, wherein the target image comprises N pixel regions, each of the N pixel regions corresponds to a brightness value, and the brightness information comprises N brightness values;
    the acquiring module is specifically configured to:
    acquire, through the first camera module, the brightness value corresponding to each pixel region;
    the processing module is specifically configured to:
    obtain, from the brightness value corresponding to each pixel region, the exposure parameter required by each pixel region;
    wherein N is an integer greater than or equal to 2.
  8. The image processing apparatus according to claim 7, wherein the exposure module is specifically configured to:
    use, according to the correspondence between the image collected by the second camera module and the target image, the average of the exposure parameters required by P pixel regions among the N pixel regions as the target exposure parameter, and set the target exposure parameter as the exposure parameter required by the second camera module for exposure processing;
    wherein P is an integer less than or equal to N and greater than or equal to 2, and the P pixel regions are the regions among the N pixel regions that fall within the field of view of the second camera module.
  9. The image processing apparatus according to claim 8, wherein, when the field of view of the second camera module moves, the apparatus further comprises:
    a prediction module, configured to predict the correspondence between the image collected by the second camera module and the target image according to the movement trend of the field of view of the second camera module, before the average of the exposure parameters required by the P pixel regions among the N pixel regions is used as the target exposure parameter according to the correspondence between the image collected by the second camera module and the target image and the target exposure parameter is set as the exposure parameter required by the second camera module for exposure processing.
  10. The image processing apparatus according to claim 9, wherein
    the field of view of the first camera module covers and is larger than the field of view of the second camera module;
    or the field of view of the first camera module at least partially overlaps the field of view of the second camera module.
  11. An electronic device, comprising a processor, a memory, and a program or instruction stored in the memory and executable on the processor, wherein the program or instruction, when executed by the processor, implements the steps of the image processing method according to any one of claims 1 to 5.
  12. A readable storage medium, storing a program or instruction that, when executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 5.
  13. A chip, comprising a processor and a communication interface, the communication interface being coupled to the processor, the processor being configured to run a program or instruction to implement the image processing method according to any one of claims 1 to 5.
  14. A computer program product, executed by at least one processor to implement the image processing method according to any one of claims 1 to 5.
  15. An electronic device, wherein the electronic device is configured to perform the image processing method according to any one of claims 1 to 5.
PCT/CN2022/100841 2021-06-24 2022-06-23 Image processing method and apparatus, and electronic device WO2022268170A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110704355.9A 2021-06-24 2021-06-24 Image processing method and apparatus, and electronic device CN113572970A (zh)
CN202110704355.9 2021-06-24

Publications (1)

Publication Number Publication Date
WO2022268170A1 true WO2022268170A1 (zh) 2022-12-29

Family

ID=78162604

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/100841 WO2022268170A1 (zh) 2021-06-24 2022-06-23 Image processing method and apparatus, and electronic device

Country Status (2)

Country Link
CN (1) CN113572970A (zh)
WO (1) WO2022268170A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113572970A (zh) 2021-06-24 2021-10-29 Vivo Mobile Communication (Hangzhou) Co., Ltd. Image processing method and apparatus, and electronic device
CN114125314B (zh) * 2021-11-23 2024-08-23 Spreadtrum Communications (Shanghai) Co., Ltd. Brightness synchronization method and apparatus, storage medium, and device
CN114143430A (zh) 2021-11-30 2022-03-04 Vivo Mobile Communication Co., Ltd. Image sensor, camera module, electronic device, and image acquisition method
CN113973181A (zh) 2021-11-30 2022-01-25 Vivo Mobile Communication Co., Ltd. Image sensor, camera module, and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009060289A (ja) * 2007-08-30 2009-03-19 Honda Motor Co Ltd Exposure control device for a camera
CN106878625A (zh) * 2017-04-19 2017-06-20 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Dual-camera synchronous exposure method and system
CN108337446A (zh) * 2018-04-12 2018-07-27 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Dual-camera-based high dynamic range image acquisition method, apparatus, and device
CN109413336A (zh) * 2018-12-27 2019-03-01 Beijing Megvii Technology Co., Ltd. Photographing method and apparatus, electronic device, and computer-readable storage medium
CN113572970A (zh) * 2021-06-24 2021-10-29 Vivo Mobile Communication (Hangzhou) Co., Ltd. Image processing method and apparatus, and electronic device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112004029B (zh) * 2019-05-27 2022-03-15 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Exposure processing method and apparatus, electronic device, and computer-readable storage medium
CN110225248B (zh) * 2019-05-29 2021-11-16 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image acquisition method and apparatus, electronic device, and computer-readable storage medium
CN110213494B (zh) * 2019-07-03 2021-05-11 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Photographing method and apparatus, electronic device, and computer-readable storage medium


Also Published As

Publication number Publication date
CN113572970A (zh) 2021-10-29

Similar Documents

Publication Publication Date Title
WO2022268170A1 (zh) Image processing method and apparatus, and electronic device
WO2022268216A1 (zh) Pixel structure, image sensor, control method and apparatus, and electronic device
US10609300B2 (en) Image sensor, operation method thereof, and imaging device
JP5937767B2 (ja) Imaging device and imaging method
JP5646421B2 (ja) Solid-state imaging device and solid-state imaging system
JP5946970B2 (ja) Imaging device and imaging method
EP2587407B1 (en) Vision recognition apparatus and method
JP2018530250A (ja) High dynamic range solid-state image sensor and camera system
US8253810B2 (en) Method, apparatus and system for image stabilization using a single pixel array
EP2903258A1 (en) Image-processing device and method, and image pickup device
JP2007116208A (ja) Compound-eye imaging device
US11496666B2 (en) Imaging apparatus with phase difference detecting element
KR20210114290A (ko) Image sensor and photographing device including the same
WO2022268132A1 (zh) Image processing method and apparatus, and electronic device
JP6998454B2 (ja) Imaging device, imaging method, program, and recording medium
CN113301262B (zh) Pixel processing circuit, method and apparatus, and electronic device
CN113286092B (zh) Pixel processing circuit, method and apparatus, and electronic device
CN113301261B (zh) Pixel processing circuit, method and apparatus, and electronic device
CN113973181A (zh) Image sensor, camera module, and electronic device
JP6207360B2 (ja) Imaging device and image signal processing method
JP2022064801A (ja) Image sensor and electronic device
CN115589515A (zh) Camera module, electronic device, and image acquisition method
CN115118856A (zh) Image sensor, image processing method, camera module, and electronic device
CN118474504A (zh) Pixel circuit, image sensor, photographing method and apparatus, device, and medium
KR101090969B1 (ko) Image sensing device with alarm function

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22827662

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22827662

Country of ref document: EP

Kind code of ref document: A1