CN111917995B - Image processing method, image processing device, electronic equipment and storage medium - Google Patents
Image processing method, image processing device, electronic equipment and storage medium Download PDFInfo
- Publication number
- CN111917995B (granted publication of application CN202010616011.8A)
- Authority
- CN
- China
- Prior art keywords
- exposure
- image
- sensitivity
- acquiring
- noise reduction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
-
- G06T5/92—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Abstract
The application provides an image processing method, an image processing apparatus, an electronic device, and a storage medium. The image processing method comprises the following steps: acquiring a plurality of first exposure parameters with different exposure compensation values, wherein each first exposure parameter comprises an exposure time and a sensitivity; lengthening the exposure time of each first exposure parameter by a preset amplitude and correspondingly reducing the sensitivity to generate a corresponding second exposure parameter; acquiring a corresponding exposure image according to each second exposure parameter; and processing the plurality of exposure images to generate a corresponding high dynamic range image. By adjusting the plurality of first exposure parameters with different exposure compensation values, each exposure image is captured under a long-exposure-time, low-sensitivity strategy, and the multiple exposure images are then fused into a high dynamic range image. This improves the dynamic range of the image, avoids uneven exposure, and improves image quality.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
As a large-screen smart device, the smart screen offers audio/video, entertainment, and education functions in addition to the basic television functions. With a camera added, its capabilities expand further, and it can serve as a hub for home Internet-of-Things interaction. However, the photographing function of the smart screen is generally plain single-frame capture, which has significant shortcomings. On one hand, compared with a smartphone, the image signal processing unit of a smart screen differs in capability; moreover, since smart screens are mostly used indoors, ambient lighting is often poor and image noise is high. On the other hand, because of indoor lighting, the camera may end up shooting against the light, leaving the picture dark overall, or the lighting may be uneven, overexposing parts of the picture.
In view of the above problems, no effective solution has yet been proposed.
Disclosure of Invention
An object of the embodiments of the present application is to provide an image processing method, an image processing apparatus, an electronic device, and a storage medium that can avoid uneven image exposure and improve image quality.
In a first aspect, an embodiment of the present application provides an image processing method, including:
acquiring a plurality of first exposure parameters with different exposure compensation values, wherein the first exposure parameters comprise exposure time and sensitivity;
lengthening the exposure time of each first exposure parameter by a preset amplitude and correspondingly reducing the sensitivity to generate a corresponding second exposure parameter;
controlling a camera to obtain a corresponding exposure image according to each second exposure parameter;
processing a plurality of the exposure images to generate corresponding high dynamic range images.
In the embodiments of the application, the plurality of first exposure parameters with different exposure compensation values are adjusted so that each corresponding exposure image is obtained under a long-exposure-time, low-sensitivity strategy; the plurality of exposure images are then fused into a high dynamic range image. This improves the dynamic range of the image, avoids uneven exposure, and improves image quality.
Optionally, in the image processing method according to the embodiment of the present application, the acquiring a plurality of first exposure parameters with different exposure compensation values includes:
acquiring a current environment brightness value;
and acquiring a plurality of first exposure parameters with different exposure compensation values in corresponding quantity according to the environment brightness value, wherein the quantity of the first exposure parameters is inversely related to the environment brightness value.
In these embodiments, the number of first exposure parameters is adjusted based on the ambient brightness value, which controls how many exposure images subsequently participate in image fusion; this improves the dynamic range of the fused image while preserving efficiency.
Optionally, in the image processing method according to the embodiment of the present application, before the lengthening of the exposure time of each first exposure parameter by a preset amplitude and the corresponding reduction of the sensitivity to generate a corresponding second exposure parameter, the method includes:
acquiring a current environment brightness value;
and acquiring a corresponding preset amplitude according to the environment brightness value, wherein the preset amplitude is negatively correlated with the environment brightness value.
In these embodiments, the lengthening amplitude of the exposure time and the reduction amplitude of the sensitivity are adjusted based on the ambient brightness value, ensuring that the subsequently captured exposure images have sufficient exposure time while keeping their noise low.
Optionally, in the image processing method according to the embodiment of the present application, the processing the plurality of exposure images to generate corresponding high dynamic range images includes:
carrying out noise reduction processing on each exposure image by adopting a target noise reduction model to obtain a corresponding low-noise exposure image;
and fusing the low-noise exposure images to generate a corresponding high-dynamic-range image.
Optionally, in the image processing method according to the embodiment of the present application, before performing noise reduction processing on each of the exposure images by using a target noise reduction model to obtain a corresponding low-noise exposure image, the method further includes:
and acquiring a corresponding target noise reduction model from a plurality of noise reduction models according to the sensitivity adopted in the generation of each exposure image.
Optionally, in the image processing method according to an embodiment of the present application, the method further includes:
acquiring a plurality of sample sets, wherein each sample set comprises a plurality of sample data, each sample data comprises sensitivity, an environment brightness value, exposure time and a corresponding exposure image, sample data of different sample sets have different sensitivities, and sample data of the same sample set has the same sensitivity;
and training the initial noise reduction model according to a plurality of sample data in each sample set respectively to generate a corresponding noise reduction model, and obtaining one noise reduction model by correspondingly training each sample set.
Optionally, in the image processing method according to this embodiment of the present application, the plurality of first exposure parameters with different exposure compensation values include a first exposure parameter with a compensation value of 0, at least one first exposure parameter with a positive compensation value, and at least one first exposure parameter with a negative compensation value.
In a second aspect, an embodiment of the present application further provides an image processing apparatus, including:
the system comprises a first acquisition module, a second acquisition module and a control module, wherein the first acquisition module is used for acquiring a plurality of first exposure parameters with different exposure compensation values, and the first exposure parameters comprise exposure time and sensitivity;
the first generation module is used for lengthening the exposure time of each first exposure parameter by a preset amplitude and reducing the sensitivity by a corresponding amplitude so as to generate a corresponding second exposure parameter;
the second acquisition module is used for controlling the camera to acquire a corresponding exposure image according to each second exposure parameter;
and the second generation module is used for processing the exposure images to generate corresponding high dynamic range images.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory stores computer-readable instructions, and when the computer-readable instructions are executed by the processor, the steps in the method as provided in the first aspect are executed.
In a fourth aspect, embodiments of the present application provide a storage medium, on which a computer program is stored, where the computer program, when executed by a processor, performs the steps in the method as provided in the first aspect.
As can be seen from the above, the image processing method, image processing apparatus, electronic device, and storage medium provided in the embodiments of the present application acquire a plurality of first exposure parameters with different exposure compensation values, where each first exposure parameter includes an exposure time and a sensitivity; lengthen the exposure time of each first exposure parameter by a preset amplitude and correspondingly reduce the sensitivity to generate a corresponding second exposure parameter; control a camera to obtain a corresponding exposure image according to each second exposure parameter; and process the plurality of exposure images to generate a corresponding high dynamic range image. By adjusting the first exposure parameters with different exposure compensation values, each exposure image is captured under a long-exposure-time, low-sensitivity strategy, and the multiple exposure images are fused into a high dynamic range image; this improves the dynamic range of the image, avoids uneven exposure, and improves image quality.
Additional features and advantages of the present application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the embodiments of the present application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, fig. 1 is a flowchart of an image processing method according to some embodiments of the present disclosure. The image processing method can be applied to electronic devices with cameras, such as mobile phones, tablet computers, smart screens and the like, and is particularly suitable for smart screens, which are described below as examples. The image processing method comprises the following steps:
s101, acquiring a plurality of first exposure parameters with different exposure compensation values, wherein the first exposure parameters comprise exposure time and sensitivity.
And S102, lengthening the exposure time of each first exposure parameter by a preset amplitude, and reducing the sensitivity by a corresponding amplitude to generate a corresponding second exposure parameter.
S103, acquiring a corresponding exposure image according to each second exposure parameter.
And S104, processing the plurality of exposure images to generate corresponding high dynamic range images.
In step S101, each first exposure parameter satisfies EV = S × G, where S is the exposure time and G is the sensitivity. The plurality of first exposure parameters may be pre-configured or dynamically set. Typically, one of the first exposure parameters has an exposure compensation value of 0, at least one has a positive compensation value, and at least one has a negative compensation value. For example, three first exposure parameters may be EV0, EV+2, and EV-2, with exposure compensation values of 0, +2, and -2 respectively.
In some embodiments, step S101 includes the following sub-steps: S1011, acquiring a current ambient brightness value; S1012, acquiring a corresponding number of first exposure parameters with different exposure compensation values according to the ambient brightness value, wherein the number of first exposure parameters is negatively correlated with the ambient brightness value. For example, if the ambient brightness value is less than a1, the environment is dark, so five first exposure parameters are set (for example EV0, EV+1, EV+2, EV-1, and EV-2), and five exposure images are generated subsequently. If the ambient brightness value is greater than a1 and less than or equal to a2, four first exposure parameters are generated (for example EV0, EV-1, EV+1, and EV+2). If the ambient brightness value is greater than a2, three first exposure parameters are set (for example EV0, EV-1, and EV+1). This is only an example; the threshold values such as a1 and a2 can be obtained through multiple experiments according to the actual situation.
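The brightness-dependent selection above can be sketched as follows. Note that the thresholds and the EV bracket sets are illustrative placeholders, not values taken from the application:

```python
def select_exposure_compensations(ambient_brightness, a1=50, a2=200):
    """Return exposure compensation values; darker scenes get more brackets.

    The thresholds a1/a2 and the EV sets below are hypothetical examples,
    chosen only to illustrate the negative correlation between ambient
    brightness and the number of first exposure parameters.
    """
    if ambient_brightness < a1:        # dark scene: five brackets
        return [0, +1, +2, -1, -2]
    elif ambient_brightness <= a2:     # medium scene: four brackets
        return [0, -1, +1, +2]
    else:                              # bright scene: three brackets
        return [0, -1, +1]
```

In practice the thresholds would be calibrated per device, as the description notes, through repeated experiments.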
In step S102, consider a first exposure parameter EV1 = S1 × G1, where S1 and G1 are its exposure time and sensitivity before adjustment. The exposure time is lengthened by an adjustment coefficient x, so that the adjusted exposure time is S2 = x·S1, and the sensitivity is correspondingly reduced to G2 = G1/x, yielding a second exposure parameter EV2 = S2 × G2. The first and second exposure parameters have the same value, i.e., S1·G1 = S2·G2, but the second exposure parameter has a longer exposure time and a lower sensitivity. Of course, the sensitivity and exposure time cannot be adjusted beyond the performance range of the camera's image sensor. Moreover, because the smart screen is a fixed device with very little shake, the exposure strategy can safely lengthen the exposure time and reduce the sensitivity, so that the image sensor outputs less noise and the photographing effect improves.
It is understood that, in some embodiments, the following steps are further included before step S102: acquiring a current ambient brightness value; and acquiring a corresponding preset amplitude according to the ambient brightness value, wherein the preset amplitude is negatively correlated with the ambient brightness value. For example, when the ambient brightness value is in the interval [b1, b2), the adjustment coefficient is x1; in [b2, b3), it is x2; in [b3, b4), it is x3; and in [b4, b5), it is x4. Of course, it is not limited thereto.
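The interval lookup can be implemented with a sorted boundary table. The boundary values b1..b5 and coefficients x1..x4 below are hypothetical; the only property taken from the text is the half-open intervals and the negative correlation (darker intervals get larger lengthening factors):

```python
import bisect

# Hypothetical brightness boundaries b1..b5 and coefficients x1..x4.
BOUNDS = [20, 60, 120, 240, 480]   # interval edges: [b1,b2), [b2,b3), ...
COEFFS = [8.0, 4.0, 2.0, 1.5]      # one coefficient per interval

def preset_amplitude(brightness):
    """Return the adjustment coefficient for the interval containing
    `brightness`, or raise if it falls outside all configured intervals."""
    i = bisect.bisect_right(BOUNDS, brightness) - 1
    if not 0 <= i < len(COEFFS):
        raise ValueError("brightness outside configured intervals")
    return COEFFS[i]
```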
Of course, it can also be understood that, in some embodiments, the preset amplitude is a preset fixed value, that is, the adjustment coefficient x is fixed; after the plurality of first exposure parameters are obtained, the coefficient x is used to lengthen the exposure time of each first exposure parameter and correspondingly reduce its sensitivity, thereby reducing noise interference in the subsequently formed exposure images.
In step S103, the plurality of second exposure parameters generated from the plurality of first exposure parameters are sent to the camera, and the camera acquires one corresponding exposure image for each second exposure parameter.
In step S104, an existing HDR algorithm may be adopted to fuse the acquired multiple exposure images into the final high dynamic range image. Of course, the exposure images may also first be subjected to noise reduction processing and then fused.
Specifically, in some embodiments, step S104 includes: S1041, performing noise reduction processing on each exposure image by adopting a target noise reduction model to obtain a corresponding low-noise exposure image; and S1042, performing fusion processing on the low-noise exposure images to generate a corresponding high dynamic range image. When the smart screen is used for photographing, indoor lighting easily leaves the camera backlit, so that the subject may be dark, or a complex and uneven lighting environment may overexpose part of the picture; fusing multiple low-noise exposure images with the HDR algorithm therefore improves the dynamic range of the photo.
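As a stand-in for the HDR fusion step, the sketch below shows a naive exposure fusion: each pixel is weighted by how close it is to mid-gray ("well-exposedness"), then averaged. This is an assumption-laden simplification of full algorithms such as Mertens-style exposure fusion, not the method claimed in the application:

```python
import numpy as np

def fuse_exposures(images, sigma=0.2):
    """Naive exposure fusion over float images in [0, 1].

    Weights each pixel of each frame by a Gaussian well-exposedness term
    centered at 0.5, normalizes the weights per pixel, and returns the
    weighted average. sigma and the weighting scheme are illustrative.
    """
    stack = np.stack([np.asarray(im, dtype=np.float64) for im in images])
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2)) + 1e-12
    weights /= weights.sum(axis=0, keepdims=True)   # normalize per pixel
    return (weights * stack).sum(axis=0)
```

Well-exposed frames dominate each pixel, so clipped shadows and blown highlights contribute little to the fused result.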
The target noise reduction model is obtained by pre-training and follows the prior art, so it needs no further description here. In some embodiments, a plurality of noise reduction models are trained in advance for different sensitivities; therefore, before step S1041 is performed, a corresponding target noise reduction model is obtained from the plurality of noise reduction models according to the sensitivity used when each exposure image was generated. A smart screen sits in an indoor environment and is relatively likely to be used at night; in both cases brightness is insufficient and the resulting picture is noisy, so a deep-learning-based multi-frame noise reduction scheme can further reduce the noise.
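Selecting the per-sensitivity model can be sketched as a nearest-ISO lookup. The registry mapping and field names are hypothetical; the application only specifies that the model is chosen according to the capture sensitivity:

```python
def select_denoise_model(iso, models):
    """Pick the noise reduction model trained at the ISO closest to the
    capture ISO.

    `models` is a hypothetical registry mapping training ISO -> model
    object; nearest-neighbor matching is an assumption, since the text
    only says the model is selected according to the sensitivity used.
    """
    nearest_iso = min(models, key=lambda trained: abs(trained - iso))
    return models[nearest_iso]
```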
Therefore, in some embodiments, a noise reduction model training step is further included before step S104: S410, acquiring a plurality of sample sets, wherein each sample set comprises a plurality of sample data, each sample data comprises a sensitivity, an ambient brightness value, an exposure time, and a corresponding exposure image, sample data of different sample sets have different sensitivities, and sample data of the same sample set have the same sensitivity; and S420, training the initial noise reduction model separately on the sample data of each sample set, so that each sample set yields one corresponding noise reduction model.
In step S410, the camera is placed in a light box in a darkroom; the exposure time and sensitivity of the camera are set, an original exposure image is captured, and the original exposure image together with the light box brightness, exposure time, and sensitivity at that moment is recorded as one piece of sample data. The operation is then repeated, adjusting the light box brightness, exposure time, and sensitivity to obtain multiple original exposure images and recording for each the corresponding light box brightness, exposure time, and sensitivity, thereby obtaining a plurality of sample data. The sample data are then grouped with sensitivity as the reference standard: sample data with the same sensitivity form one sample set, so the plurality of sensitivities correspond to a plurality of sample sets.
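The grouping step can be sketched directly. The record field names below are illustrative, not names from the application:

```python
from collections import defaultdict

def build_sample_sets(samples):
    """Group sample records into sample sets keyed by sensitivity.

    Each record is assumed to be a dict carrying the four items the text
    lists per sample: 'iso' (sensitivity), 'brightness', 'exposure_time',
    and 'image'. Records sharing an ISO form one sample set.
    """
    sets = defaultdict(list)
    for record in samples:
        sets[record["iso"]].append(record)
    return dict(sets)
```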
In step S420, noise extraction is performed on each original exposure image to obtain its noise information, and the noise is then removed from the corresponding original exposure image to obtain a low-noise exposure image. Finally, all sample data in each sample set are fed in turn into the initial noise reduction model for training until a corresponding noise reduction model is obtained. During training, the cost function of the noise reduction model can be computed against the low-noise image, and training is complete when the cost function falls within a preset error.
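The train-until-threshold loop can be illustrated with a deliberately tiny stand-in model: a single learnable blending weight fitted by gradient descent on the MSE cost between the model output and the low-noise target. The model, learning rate, and stopping threshold are all assumptions; the real per-ISO denoiser would be a deep network trained the same way in spirit:

```python
import numpy as np

def train_denoiser(noisy, clean, lr=0.1, max_epochs=500, target_error=1e-4):
    """Minimal stand-in for the per-sensitivity training loop.

    Learns a scalar weight w so that w * noisy approximates the low-noise
    target, stopping once the MSE cost falls below the preset error (or
    after max_epochs). Returns the learned weight and final cost.
    """
    w = 0.0
    for _ in range(max_epochs):
        pred = w * noisy
        cost = np.mean((pred - clean) ** 2)
        if cost < target_error:          # the "preset error" stop criterion
            break
        grad = 2 * np.mean((pred - clean) * noisy)   # d(cost)/dw
        w -= lr * grad
    return w, cost
```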
As can be seen from the above, the image processing method provided in the embodiments of the present application acquires a plurality of first exposure parameters with different exposure compensation values, where each first exposure parameter includes an exposure time and a sensitivity; lengthens the exposure time of each first exposure parameter by a preset amplitude and correspondingly reduces the sensitivity to generate a corresponding second exposure parameter; controls a camera to obtain a corresponding exposure image according to each second exposure parameter; and processes the plurality of exposure images to generate a corresponding high dynamic range image. By adjusting the first exposure parameters with different exposure compensation values, each exposure image is captured under a long-exposure-time, low-sensitivity strategy, and the multiple exposure images are fused into a high dynamic range image; this improves the dynamic range of the image, avoids uneven exposure, and improves image quality.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an image processing apparatus according to some embodiments of the present disclosure. The image processing apparatus is integrated in the form of a computer program in an electronic device having a camera. The image processing apparatus includes: a first obtaining module 201, a first generating module 202, a second obtaining module 203 and a second generating module 204.
The first obtaining module 201 is configured to obtain a plurality of first exposure parameters with different exposure compensation values, where each first exposure parameter includes an exposure time and a sensitivity and satisfies EV = S × G, with S the exposure time and G the sensitivity. The plurality of first exposure parameters may be pre-configured or dynamically set. Typically, one of the first exposure parameters has an exposure compensation value of 0, at least one has a positive compensation value, and at least one has a negative compensation value. For example, three first exposure parameters may be EV0, EV+2, and EV-2, with exposure compensation values of 0, +2, and -2 respectively.
In some embodiments, the first obtaining module 201 is specifically configured to acquire a current ambient brightness value, and to acquire a corresponding number of first exposure parameters with different exposure compensation values according to the ambient brightness value, wherein the number of first exposure parameters is negatively correlated with the ambient brightness value. For example, if the ambient brightness value is less than a1, the environment is dark, so five first exposure parameters are set (for example EV0, EV+1, EV+2, EV-1, and EV-2), and five exposure images are generated subsequently. If the ambient brightness value is greater than a1 and less than or equal to a2, four first exposure parameters are generated (for example EV0, EV-1, EV+1, and EV+2). If the ambient brightness value is greater than a2, three first exposure parameters are set (for example EV0, EV-1, and EV+1). This is only an example; the threshold values such as a1 and a2 can be obtained through multiple experiments according to the actual situation.
The first generating module 202 is configured to lengthen the exposure time of each first exposure parameter by a preset amplitude and reduce the sensitivity by a corresponding amplitude, so as to generate a corresponding second exposure parameter. For example, consider a first exposure parameter EV1 = S1 × G1, where S1 and G1 are its exposure time and sensitivity before adjustment. The exposure time is lengthened by an adjustment coefficient x, so that the adjusted exposure time is S2 = x·S1, and the sensitivity is correspondingly reduced to G2 = G1/x, yielding a second exposure parameter EV2 = S2 × G2. The first and second exposure parameters have the same value, i.e., S1·G1 = S2·G2, but the second exposure parameter has a longer exposure time and a lower sensitivity. Of course, the sensitivity and exposure time cannot be adjusted beyond the performance range of the camera's image sensor. Moreover, because the smart screen is a fixed device with very little shake, the exposure strategy can safely lengthen the exposure time and reduce the sensitivity, so that the image sensor outputs less noise and the photographing effect improves.
It is to be appreciated that in some embodiments, the first generating module 202 is further configured to: acquire a current ambient brightness value; and acquire a corresponding preset amplitude according to the ambient brightness value, where the preset amplitude is negatively correlated with the ambient brightness value. For example, when the ambient brightness value is in the [b1, b2) interval, the adjustment coefficient is x1; in the [b2, b3) interval, the adjustment coefficient is x2; in the [b3, b4) interval, the adjustment coefficient is x3; and in the [b4, b5) interval, the adjustment coefficient is x4. Of course, it is not limited thereto.
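The interval lookup above amounts to a step function over brightness. A sketch under assumed values (the bounds b1..b5 and coefficients x1..x4 here are placeholders, chosen only so that the coefficient decreases as brightness increases):

```python
import bisect

def adjustment_coefficient(brightness,
                           bounds=(10, 50, 100, 200, 400),
                           coeffs=(8.0, 4.0, 2.0, 1.5)):
    """Map an ambient brightness value to the adjustment coefficient x.

    bounds define the intervals [b1,b2), [b2,b3), [b3,b4), [b4,b5);
    brightness outside [b1, b5) is clamped to the nearest interval.
    """
    i = bisect.bisect_right(bounds, brightness) - 1
    i = max(0, min(i, len(coeffs) - 1))   # clamp below b1 / at or above b5
    return coeffs[i]
```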
Of course, it can be understood that, in some embodiments, the preset amplitude is a preset fixed value, that is, the adjustment coefficient x is a fixed value. After the plurality of first exposure parameters are obtained, the adjustment coefficient x is used to lengthen the exposure time of each first exposure parameter and reduce its sensitivity accordingly, thereby reducing the interference of noise in the subsequently captured exposure images.
The second obtaining module 203 is configured to control the camera to acquire a corresponding exposure image according to each of the second exposure parameters. The plurality of second exposure parameters generated from the plurality of first exposure parameters are sent to the camera, so that the camera acquires a corresponding exposure image according to each new second exposure parameter; one exposure image is acquired for each second exposure parameter.
The second generating module 204 is configured to process the exposure images to generate a corresponding high dynamic range image. The second generating module 204 may adopt an existing HDR algorithm to fuse the acquired multiple exposure images into a final high dynamic range image; of course, it may also perform noise reduction on the exposure images first and then perform the fusion.
Specifically, in some embodiments, the second generating module 204 is specifically configured to: perform noise reduction on each exposure image using a target noise reduction model to obtain a corresponding low-noise exposure image; and fuse the low-noise exposure images to generate a corresponding high dynamic range image. The target noise reduction model is obtained by pre-training, as described below. When the smart screen is used for photographing, indoor lighting easily leaves the camera backlit, so the subject may be dark, or part of the picture may be overexposed because the lighting environment is complex or uneven; fusing the multiple low-noise exposure images with an HDR algorithm therefore improves the dynamic range of the photo.
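The denoise-then-fuse step can be sketched as below. This is not the patent's algorithm: the well-exposedness weighting is a generic stand-in for whichever HDR merge is used, and `denoise` stands in for the target noise reduction model; images are assumed normalized to [0, 1].

```python
import numpy as np

def fuse_hdr(exposures, denoise, times):
    """Denoise each bracketed frame, then fuse with a simple weighted
    average in linear radiance space (a stand-in for any HDR merge)."""
    acc, wsum = None, None
    for img, t in zip(exposures, times):
        clean = denoise(img)                    # target noise reduction model
        w = 1.0 - np.abs(clean - 0.5) * 2.0     # favor well-exposed pixels
        radiance = clean / t                    # back-project by exposure time
        acc = radiance * w if acc is None else acc + radiance * w
        wsum = w if wsum is None else wsum + w
    return acc / np.maximum(wsum, 1e-6)
```

With an identity denoiser and consistent exposures, every frame back-projects to the same radiance, so the fused result equals that radiance.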
Of course, in some embodiments, a plurality of noise reduction models are trained in advance for different sensitivities; therefore, the second generating module 204 is further configured to obtain a corresponding target noise reduction model from the plurality of noise reduction models according to the sensitivity used when generating each exposure image. When the smart screen is used for photographing, it is in an indoor environment, and the probability of it being used at night is also relatively high. In both cases the brightness is insufficient and the resulting picture is noisy, so a multi-frame noise reduction scheme based on deep learning can further reduce the noise.
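Selecting the target model by sensitivity can be as simple as a nearest-ISO lookup. A minimal sketch, assuming the pre-trained models are keyed by their training sensitivity (the patent does not specify the matching rule; nearest-neighbor is one plausible choice):

```python
def select_noise_model(models, sensitivity):
    """models: dict mapping training sensitivity (ISO) -> noise reduction model.

    Returns the model trained at the sensitivity closest to the one
    used when the exposure image was captured.
    """
    iso = min(models, key=lambda k: abs(k - sensitivity))
    return models[iso]
```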
Therefore, in some embodiments, a plurality of noise reduction models need to be trained in advance. When training the noise reduction models: first, a plurality of sample sets are acquired, where each sample set includes a plurality of sample data, each sample data includes a sensitivity, an ambient brightness value, an exposure time and a corresponding exposure image, sample data of different sample sets have different sensitivities, and sample data of the same sample set have the same sensitivity; then, an initial noise reduction model is trained according to the plurality of sample data in each sample set to generate a corresponding noise reduction model, with one noise reduction model obtained per sample set.
The camera is placed in a light box in a darkroom; the exposure time and sensitivity of the camera are set; an original exposure image is then captured, and the original exposure image together with the light box brightness, exposure time and sensitivity at that moment are recorded as one sample data. The operation is then repeated, adjusting the light box brightness, exposure time and sensitivity in turn to obtain a plurality of original exposure images, and recording the light box brightness, exposure time and sensitivity corresponding to each, thereby obtaining a plurality of sample data. The sample data are then grouped with the sensitivity as the reference standard: sample data with the same sensitivity form one sample set, so that the plurality of sensitivities correspond to the plurality of sample sets. Next, noise extraction is performed on each original exposure image to obtain its noise information, and the noise information is subtracted from the corresponding original exposure image to obtain a low-noise exposure image. Finally, all sample data in each sample set are fed in turn into the initial noise reduction model for training until the corresponding noise reduction model is obtained. During training, the cost function of the corresponding noise reduction model can be computed against the low-noise image; when the cost function meets a preset error, training is complete.
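The data-preparation steps above (one sample set per sensitivity) can be sketched as follows. The sample field names are hypothetical; the patent only specifies that each sample records sensitivity, brightness, exposure time and the captured image.

```python
from collections import defaultdict

def group_by_sensitivity(samples):
    """samples: list of dicts with keys 'iso', 'brightness', 'time', 'image'.

    Groups sample data into sample sets, one per sensitivity, as used
    for training one noise reduction model per sample set.
    """
    sets = defaultdict(list)
    for s in samples:
        sets[s["iso"]].append(s)
    return dict(sets)
```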
As can be seen from the above, the image processing apparatus provided in the embodiment of the present application acquires a plurality of first exposure parameters with different exposure compensation values, where each first exposure parameter includes an exposure time and a sensitivity; lengthens the exposure time of each first exposure parameter by a preset amplitude and reduces the sensitivity correspondingly to generate a corresponding second exposure parameter; controls the camera to acquire a corresponding exposure image according to each second exposure parameter; and processes the plurality of exposure images to generate a corresponding high dynamic range image. By adjusting the first exposure parameters with different exposure compensation values toward long exposure time and low sensitivity, the multiple exposure images acquired with the second exposure parameters can be fused into a high dynamic range image, which improves the dynamic range of the image, avoids uneven exposure, and improves image quality.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 3 includes a processor 301 and a memory 302, which are interconnected and communicate with each other via a communication bus 303 and/or another form of connection mechanism (not shown). The memory 302 stores a computer program executable by the processor 301; when the computing device is running, the processor 301 executes the computer program to perform the method in any of the alternative implementations of the embodiments described above.
The embodiment of the present application provides a storage medium storing a computer program that, when executed by a processor, performs the method in any optional implementation of the above embodiments. The storage medium may be implemented by any type of volatile or nonvolatile storage device or combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described apparatus embodiments are merely illustrative; for example, the division of the units is only one logical division, and other divisions are possible in actual implementation: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be electrical, mechanical or in another form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (10)
1. An image processing method, comprising:
acquiring a plurality of first exposure parameters with different exposure compensation values, wherein the first exposure parameters comprise exposure time and sensitivity;
lengthening the exposure time of each first exposure parameter by a preset amplitude and reducing the sensitivity by a corresponding amplitude to generate a corresponding second exposure parameter;
acquiring a corresponding exposure image according to each second exposure parameter;
and performing fusion processing on the plurality of exposure images to generate corresponding high dynamic range images.
2. The method according to claim 1, wherein said obtaining a plurality of first exposure parameters having different exposure compensation values comprises:
acquiring a current environment brightness value;
and acquiring a plurality of first exposure parameters with different exposure compensation values in corresponding quantity according to the environment brightness value, wherein the quantity of the first exposure parameters is inversely related to the environment brightness value.
3. The method of claim 1, wherein before increasing the exposure time of each of the first exposure parameters by a predetermined magnitude and decreasing the sensitivity by a corresponding magnitude to generate the corresponding second exposure parameter, the method comprises:
acquiring a current environment brightness value;
and acquiring a corresponding preset amplitude according to the environment brightness value, wherein the preset amplitude is negatively correlated with the environment brightness value.
4. The image processing method of claim 1, wherein said processing a plurality of said exposure images to generate corresponding high dynamic range images comprises:
carrying out noise reduction processing on each exposure image by adopting a target noise reduction model to obtain a corresponding low-noise exposure image;
and fusing the low-noise exposure images to generate a corresponding high-dynamic-range image.
5. The image processing method according to claim 4, wherein before performing noise reduction processing on each of the exposed images by using the target noise reduction model to obtain the corresponding low-noise exposed image, the method further comprises:
and acquiring a corresponding target noise reduction model from a plurality of noise reduction models according to the sensitivity adopted in the generation of each exposure image.
6. The image processing method according to claim 5, characterized in that the method further comprises:
acquiring a plurality of sample sets, wherein each sample set comprises a plurality of sample data, and each sample data comprises sensitivity, an environment brightness value, exposure time and a corresponding exposure image, wherein the sample data of different sample sets has different sensitivity, and the sample data of the same sample set has the same sensitivity;
and training the initial noise reduction model according to a plurality of sample data in each sample set respectively to generate a corresponding noise reduction model, and obtaining one noise reduction model by correspondingly training each sample set.
7. The image processing method according to claim 1, wherein the plurality of first exposure parameters having different exposure compensation values include one first exposure parameter having a compensation value of 0, at least one first exposure parameter having a positive compensation value, and at least one first exposure parameter having a negative compensation value.
8. An image processing apparatus characterized by comprising:
the system comprises a first acquisition module, a second acquisition module and a control module, wherein the first acquisition module is used for acquiring a plurality of first exposure parameters with different exposure compensation values, and the first exposure parameters comprise exposure time and sensitivity;
the first generation module is used for lengthening the exposure time of each first exposure parameter by a preset amplitude and reducing the sensitivity by a corresponding amplitude, so as to generate a corresponding second exposure parameter;
the second acquisition module is used for acquiring a corresponding exposure image according to each second exposure parameter;
and the second generation module is used for carrying out fusion processing on the plurality of exposure images so as to generate corresponding high dynamic range images.
9. An electronic device comprising a processor and a memory, the memory storing computer readable instructions that, when executed by the processor, perform the method of any one of claims 1-7.
10. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, performs the method according to any of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010616011.8A CN111917995B (en) | 2020-06-29 | 2020-06-29 | Image processing method, image processing device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111917995A CN111917995A (en) | 2020-11-10 |
CN111917995B true CN111917995B (en) | 2022-02-08 |
Family
ID=73227071
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010616011.8A Active CN111917995B (en) | 2020-06-29 | 2020-06-29 | Image processing method, image processing device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111917995B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008206021A (en) * | 2007-02-22 | 2008-09-04 | Matsushita Electric Ind Co Ltd | Imaging device and lens barrel |
CN103051841A (en) * | 2013-01-05 | 2013-04-17 | 北京小米科技有限责任公司 | Method and device for controlling exposure time |
CN105657243A (en) * | 2015-11-08 | 2016-06-08 | 乐视移动智能信息技术(北京)有限公司 | Anti-jitter delay photographing method and device |
CN106060249A (en) * | 2016-05-19 | 2016-10-26 | 维沃移动通信有限公司 | Shooting anti-shaking method and mobile terminal |
CN106657806A (en) * | 2017-01-24 | 2017-05-10 | 维沃移动通信有限公司 | Exposure method and mobile terminal |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109218628B (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN109005366B (en) | Night scene shooting processing method and device for camera module, electronic equipment and storage medium | |
CN109348089B (en) | Night scene image processing method and device, electronic equipment and storage medium | |
CN101997981B (en) | Mobile phone camera-based latitude realization method and mobile phone | |
CN106060249B (en) | Photographing anti-shake method and mobile terminal | |
CN109218627B (en) | Image processing method, image processing device, electronic equipment and storage medium | |
US11532076B2 (en) | Image processing method, electronic device and storage medium | |
KR102149187B1 (en) | Electronic device and control method of the same | |
CN109729274B (en) | Image processing method, image processing device, electronic equipment and storage medium | |
US20170163878A1 (en) | Method and electronic device for adjusting shooting parameters of camera | |
US10692202B2 (en) | Flat surface detection in photographs for tamper detection | |
CN109361853B (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN111885312B (en) | HDR image imaging method, system, electronic device and storage medium | |
CN116744120B (en) | Image processing method and electronic device | |
CN108513069A (en) | Image processing method, device, storage medium and electronic equipment | |
WO2023226612A1 (en) | Exposure parameter determining method and apparatus | |
CN113596357B (en) | Image signal processor, image signal processing device and method, chip and terminal equipment | |
CN113286094A (en) | Automatic image exposure method, device, equipment and medium | |
CN116416122B (en) | Image processing method and related device | |
CN111917995B (en) | Image processing method, image processing device, electronic equipment and storage medium | |
US20150172613A1 (en) | Image processing method and image processing device | |
US20170163852A1 (en) | Method and electronic device for dynamically adjusting gamma parameter | |
JP2020504969A (en) | Intelligent shooting method and device, intelligent terminal | |
CN111726530A (en) | Method, device and equipment for acquiring multiple paths of video streams | |
WO2023000878A1 (en) | Photographing method and apparatus, and controller, device and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||