CN107392870B - Image processing method, image processing device, mobile terminal and computer readable storage medium - Google Patents



Publication number
CN107392870B
Authority
CN
China
Prior art keywords
image, processed, defogging, RGB, brightness value
Prior art date
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
CN201710625428.9A
Other languages
Chinese (zh)
Other versions
CN107392870A (en)
Inventor
袁全
Current Assignee (the listed assignees may be inaccurate)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (the priority date is an assumption and is not a legal conclusion)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710625428.9A priority Critical patent/CN107392870B/en
Publication of CN107392870A publication Critical patent/CN107392870A/en
Application granted granted Critical
Publication of CN107392870B publication Critical patent/CN107392870B/en

Classifications

    • G06T 5/73
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene

Abstract

Embodiments of the invention relate to an image processing method and apparatus, a mobile terminal, and a computer-readable storage medium. The method comprises the following steps: determining the exposure of a first image to be processed; when the exposure is greater than a preset threshold, reducing the brightness value of the first image to be processed to obtain a second image to be processed; calculating defogging parameters of the second image to be processed from the reduced brightness value; and performing defogging processing on the first image to be processed according to the defogging parameters. The image processing method and apparatus, mobile terminal, and computer-readable storage medium can defog images with high exposure and improve the defogging effect.

Description

Image processing method, image processing device, mobile terminal and computer readable storage medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to an image processing method and apparatus, a mobile terminal, and a computer-readable storage medium.
Background
In foggy weather, imaging equipment is affected by particles suspended in the air: features such as color and texture in captured images are severely weakened, image definition is often low, and the overall tone tends toward gray. To make a fog-containing image clearer, it can be subjected to defogging processing. Traditional defogging algorithms include the dark channel prior algorithm; when the exposure of an image is high, the dark channel prior algorithm can fail, and the defogging effect of the image is poor.
Disclosure of Invention
The embodiment of the invention provides an image processing method, an image processing device, a mobile terminal and a computer readable storage medium, which can be used for defogging an image with high exposure and improving the defogging effect of the image.
An image processing method comprising:
determining the exposure of the first image to be processed;
when the exposure is larger than a preset threshold, reducing the brightness value of the first image to be processed to obtain a second image to be processed;
calculating the defogging parameters of the second image to be processed according to the reduced brightness value;
and performing defogging processing on the first image to be processed according to the defogging parameters.
In one embodiment, the reducing the brightness value of the first image to be processed to obtain a second image to be processed includes:
extracting image features of the first image to be processed;
analyzing the image characteristics through a pre-established image analysis model to determine the image type of the first image to be processed;
acquiring a reference brightness value matched with the image type;
and reducing the brightness value of the first image to be processed to the reference brightness value to obtain a second image to be processed.
In one embodiment, the calculating the defogging parameter of the second image to be processed according to the reduced brightness value includes:
determining an atmospheric light value according to the reduced brightness value;
calculating the original transmittance according to the atmospheric light value;
and calculating the wave band transmissivity respectively corresponding to the RGB wave bands in the second image to be processed according to the original transmissivity.
In one embodiment, the calculating, according to the original transmittance, band transmittances in the second image to be processed corresponding to three RGB bands, respectively, includes:
acquiring adjustment coefficients respectively corresponding to the RGB three wave bands in the second image to be processed;
and respectively calculating the wave band transmissivity corresponding to the RGB three wave bands according to the original transmissivity and the adjusting coefficient.
In one embodiment, the performing the defogging processing on the first image to be processed according to the defogging parameters includes:
and carrying out defogging treatment on the RGB three wave bands of the first image to be treated according to the wave band transmissivity respectively corresponding to the RGB three wave bands.
An image processing apparatus comprising:
an exposure determining module for determining the exposure of the first image to be processed;
the reducing module is used for reducing the brightness value of the first image to be processed to obtain a second image to be processed when the exposure is greater than a preset threshold;
the computing module is used for computing the defogging parameters of the second image to be processed according to the reduced brightness values;
and the defogging module is used for defogging the first image to be processed according to the defogging parameters.
In one embodiment, the lowering module includes:
the extraction unit is used for extracting the image characteristics of the first image to be processed;
the analysis unit is used for analyzing the image characteristics through a pre-established image analysis model and determining the image type of the first image to be processed;
a reference luminance value acquisition unit configured to acquire a reference luminance value matching the image type;
and the reducing unit is used for reducing the brightness value of the first image to be processed to the reference brightness value to obtain a second image to be processed.
In one embodiment, the calculation module includes:
an atmospheric light value determination unit for determining an atmospheric light value based on the reduced brightness value;
the calculating unit is used for calculating the original transmittance according to the atmospheric light value;
the wave band transmissivity calculating unit is used for calculating wave band transmissivity which corresponds to the RGB three wave bands in the second image to be processed respectively according to the original transmissivity;
the band transmittance calculation unit includes:
an adjustment coefficient obtaining subunit, configured to obtain adjustment coefficients corresponding to the three RGB bands in the second image to be processed, respectively;
the calculating subunit is used for calculating the wave band transmissivity corresponding to the RGB three wave bands respectively according to the original transmissivity and the adjusting coefficient;
and the defogging module is also used for defogging the RGB three wave bands of the first image to be processed according to the wave band transmissivity respectively corresponding to the RGB three wave bands.
A mobile terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method as described above when executing the program.
A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method as set forth above.
According to the image processing method and apparatus, the mobile terminal, and the computer-readable storage medium described above, when the exposure of the first image to be processed is greater than the preset threshold, the brightness value of the first image to be processed is reduced to obtain the second image to be processed; the defogging parameters of the second image to be processed are calculated from the reduced brightness value; and the defogging parameters are then used to defog the first image to be processed. In this way, images with high exposure can be defogged and the defogging effect of the image improved.
Drawings
FIG. 1 is a block diagram of a mobile terminal according to one embodiment;
FIG. 2 is a flow diagram illustrating a method for image processing according to one embodiment;
FIG. 3 is a flow chart illustrating reducing luminance values of a first image to be processed according to an embodiment;
FIG. 4 is a flowchart illustrating a process of calculating a defogging parameter of a second image to be processed according to the reduced brightness values in one embodiment;
FIG. 5 is a schematic diagram illustrating a process of calculating transmittance of wavelength bands corresponding to RGB three wavelength bands in a second image to be processed according to an embodiment;
FIG. 6 is a block diagram of an image processing apparatus in one embodiment;
FIG. 7 is a block diagram of a reduction module in one embodiment;
FIG. 8 is a block diagram of a compute module in one embodiment;
FIG. 9 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present invention. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is a block diagram of a mobile terminal in one embodiment. As shown in Fig. 1, the mobile terminal includes a processor, a non-volatile storage medium, an internal memory, a network interface, a display screen, and an input device, which are connected through a system bus. The non-volatile storage medium of the mobile terminal stores an operating system and computer-executable instructions, which are executed by the processor to implement the image processing method provided by the embodiments of the invention. The processor provides computing and control capabilities to support the operation of the entire mobile terminal. The internal memory in the mobile terminal provides an environment for the execution of the computer-readable instructions in the non-volatile storage medium. The network interface is used for network communication with a server. The display screen of the mobile terminal can be a liquid crystal display or an electronic ink display, and the input device can be a touch layer covering the display screen, a key, trackball, or touchpad arranged on the housing of the mobile terminal, or an external keyboard, touchpad, or mouse. The mobile terminal can be a mobile phone, a tablet computer, a personal digital assistant, or a wearable device. Those skilled in the art will appreciate that the architecture shown in Fig. 1 is only a block diagram of a portion of the architecture associated with the present application and does not constitute a limitation on the mobile terminal to which the present application applies; a particular mobile terminal may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
As shown in fig. 2, in one embodiment, there is provided an image processing method including the steps of:
in step 210, the exposure level of the first image to be processed is determined.
In this embodiment, the first image to be processed refers to an image containing fog. In foggy weather there are many particles, such as water droplets, in the atmosphere; the farther an object is from an imaging device such as a camera or video camera, the greater the influence of atmospheric particles on imaging, and a foggy image generally suffers from low contrast, low saturation, hue shift, and similar problems as a result. The mobile terminal can acquire the first image to be processed and determine its exposure, where exposure generally refers to the amount of light that enters the lens and reaches the photosensitive element during photographing and can be controlled by the combination of aperture, shutter, and sensitivity.
In one embodiment, the mobile terminal may draw a histogram of the first image to be processed, where the horizontal axis represents brightness values and the vertical axis the number of pixels. The histogram directly shows how many pixels of the first image to be processed fall into each brightness interval. From the histogram, the mobile terminal can obtain statistics such as the brightness mean and brightness median of all pixels of the first image to be processed and determine the exposure from them, where the brightness mean is the average of the brightness values of all pixels, and the brightness median is obtained by sorting the brightness values of all pixels from small to large and taking the middle value.
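The histogram statistics described above can be sketched as follows. This is a minimal NumPy illustration assuming 8-bit grayscale input; the function names, the default threshold of 180, and the choice to average the mean and median into a single exposure score are illustrative assumptions, not specified by the patent.

```python
import numpy as np

def exposure_stats(gray):
    """Compute the brightness mean and median of an 8-bit grayscale image
    from its histogram, as described in the text."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = hist.sum()
    mean = (hist * np.arange(256)).sum() / total
    # median: smallest brightness where the cumulative pixel count reaches half
    median = int(np.searchsorted(np.cumsum(hist), total / 2))
    return mean, median

def is_overexposed(gray, threshold=180):
    """Hypothetical exposure check: combine mean and median brightness and
    compare against a preset threshold (the patent leaves the threshold
    to 'actual requirements')."""
    mean, median = exposure_stats(gray)
    return (mean + median) / 2 > threshold
```
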
In other embodiments, the mobile terminal may also calculate the exposure level of the first to-be-processed image according to the aperture size, the exposure time, and the like when the first to-be-processed image is captured, and the method is not limited to the above.
And step 220, when the exposure is greater than a preset threshold, reducing the brightness value of the first image to be processed to obtain a second image to be processed.
The mobile terminal can judge whether the exposure of the first image to be processed is greater than a preset threshold; when it is, the exposure can be considered too high. The preset threshold can be set according to actual requirements. When the exposure of the first image to be processed is too high, its brightness values are too high; in defogging algorithms such as the dark channel prior algorithm, the atmospheric light value is calculated from the brightness values of the image, so an overly bright image yields an inaccurate atmospheric light value and hence poor defogging. Therefore, when the exposure of the first image to be processed is greater than the preset threshold, its brightness value can be reduced to a preset reference brightness value, obtaining a second image to be processed with the preset reference brightness value.
And step 230, calculating the defogging parameters of the second image to be processed according to the reduced brightness value.
In this embodiment, the mobile terminal may defog the first image to be processed using the dark channel prior algorithm, a defogging algorithm based on image restoration. The dark channel prior algorithm describes the fog-containing image with an atmospheric scattering model, which can be written as formula (1):
I(x)=J(x)t(x)+A(1-t(x)) (1);
wherein I(x) represents the fog-containing image to be defogged, J(x) the fog-free image obtained after defogging, x the spatial position of a pixel in the image, t(x) the transmittance, and A the atmospheric light value. In a fog-free image, some pixels always have at least one of the three RGB (red, green, blue color space) channels with a very low value, close to zero. Thus, for any image, its dark channel image can be computed as shown in formula (2):
J^dark(x) = min_{y∈Ω(x)} min_{c∈{r,g,b}} J^c(y) (2);
wherein J^dark(x) denotes the dark channel image, J^c(y) the value of color channel c, and Ω(x) a window centered on pixel x. From formulas (1) and (2), the calculation formula for the transmittance can be derived as shown in formula (3):
t(x) = 1 - min_{y∈Ω(x)} min_{c} (I^c(y)/A^c) (3);
In real life there are some particles in the air even on a clear day, so fog can still be perceived on distant objects, and this perceived fog gives a sense of depth of field. Therefore, a factor in [0, 1] can be introduced to adjust the obtained transmittance, and formula (3) becomes formula (4):
t(x) = 1 - ω·min_{y∈Ω(x)} min_{c} (I^c(y)/A^c) (4);
in this embodiment, ω represents a factor for adjusting the transmittance, and ω may have a value of 0.95 or other values, but is not limited thereto, and smaller ω represents smaller defogging degree, and larger ω represents larger defogging degree.
The mobile terminal may calculate the defogging parameters of the second image to be processed according to the reduced brightness value, where the defogging parameters may include the atmospheric light value, the transmittance, and so on of the second image to be processed. The mobile terminal can obtain the dark channel image of the second image to be processed according to formula (2) and then obtain the atmospheric light value: it can sort the pixels of the dark channel image by brightness, extract the top 0.1% of pixels from brightest to darkest, locate the corresponding positions in the second image to be processed, and take the brightness value of the brightest of those pixels as the atmospheric light value. After obtaining the atmospheric light value, the mobile terminal can calculate the transmittance of the second image to be processed according to formula (4).
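The top-0.1% selection described above might be sketched as follows. This is illustrative only: the mean-of-channels luminance proxy and the per-channel return value are assumptions, since the patent speaks only of a "brightness value".

```python
import numpy as np

def atmospheric_light(img, dark, top_fraction=0.001):
    """Estimate the atmospheric light value A: among the brightest
    `top_fraction` of dark-channel pixels, pick the pixel with the
    highest luminance in the original image and return its RGB values.
    `img` is H x W x 3; `dark` is the H x W dark channel image."""
    h, w = dark.shape
    n = max(1, int(h * w * top_fraction))
    flat_idx = np.argsort(dark.ravel())[-n:]        # brightest dark-channel pixels
    luminance = img.reshape(-1, 3).mean(axis=1)     # simple luminance proxy (assumed)
    brightest = flat_idx[np.argmax(luminance[flat_idx])]
    return img.reshape(-1, 3)[brightest]
```
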
And 240, carrying out defogging processing on the first image to be processed according to the defogging parameters.
After the mobile terminal has calculated defogging parameters such as the atmospheric light value and transmittance of the second image to be processed, it can defog the first image to be processed according to them: taking the first image to be processed as I(x) in formula (1) and substituting the atmospheric light value and transmittance into formula (1) yields the defogged image J(x). For a fog-containing image with excessively high exposure, the brightness value is reduced first and the defogging parameters are then obtained from the reduced brightness value. This avoids the failure of defogging algorithms such as the dark channel prior algorithm caused by inaccurate defogging parameters (atmospheric light value, transmittance, and so on) computed from an overexposed fog-containing image. Defogging the overexposed fog-containing image with the parameters calculated after the brightness reduction can therefore effectively improve its defogging effect.
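Inverting formula (1) with the computed atmospheric light value and transmittance can be sketched as below; rearranging I = J·t + A·(1 − t) gives J = (I − A)/t + A. The `t_min` floor is a common safeguard against division by near-zero transmittance and is an assumption, not stated in the patent.

```python
import numpy as np

def defog(I, A, t, t_min=0.1):
    """Recover the fog-free image J(x) by inverting formula (1):
    J = (I - A) / max(t, t_min) + A.
    `I` is H x W x 3, `A` a length-3 array, `t` an H x W transmittance map."""
    t = np.maximum(t, t_min)[..., None]          # floor t, broadcast over channels
    return np.clip((I - A) / t + A, 0.0, 1.0)
```
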
According to the image processing method, when the exposure of the first image to be processed is greater than the preset threshold, the brightness value of the first image to be processed is reduced to obtain the second image to be processed; the defogging parameters of the second image to be processed are calculated from the reduced brightness value; and the defogging parameters are then used to defog the first image to be processed. In this way, images with high exposure can be defogged and the defogging effect of the image improved.
As shown in fig. 3, in an embodiment, the step 220 of reducing the brightness value of the first image to be processed when the exposure level is greater than the preset threshold to obtain the second image to be processed includes the following steps:
step 302, extracting image features of a first image to be processed.
The mobile terminal can perform image recognition on the first image to be processed, can perform segmentation on the first image to be processed, segments the first image to be processed into a plurality of regions with equal size, and detects image features contained in each region one by one. The image features may include shape features, spatial features, edge features, and the like, where the shape features refer to local shapes in the first image to be processed, the spatial features refer to spatial positions or relative directional relationships among a plurality of regions segmented from the first image to be processed, and the edge features refer to boundary pixels constituting two regions in the first image to be processed.
And step 304, analyzing the image characteristics through a pre-established image analysis model, and determining the image type of the first image to be processed.
The mobile terminal can extract the detected image characteristics of the first image to be processed and analyze the image characteristics through a pre-established image analysis model. The image analysis model may be a decision model constructed in advance through machine learning, and the mobile terminal may determine an image type of the first image to be processed through the image analysis model, where the image type may include a fog-containing image of a person, a fog-containing image of an unmanned landscape, and the like, but is not limited thereto. When the image analysis model is constructed, a large number of sample images with image type labels can be obtained, the sample images are used as input of the image analysis model, and the image characteristics corresponding to each branch node and the image type corresponding to each leaf node in the image analysis model are determined through machine learning.
Step 306, obtaining a reference brightness value matched with the image type.
The mobile terminal analyzes the image characteristics of the extracted first image to be processed through the image analysis model, can determine a leaf node where the first image to be processed is located, thereby determining the image type of the first image to be processed, and can acquire a reference brightness value matched with the leaf node from the image analysis model. The reference luminance value may be an average luminance value of all sample images classified into a leaf node where the first to-be-processed image is located when the image analysis model is constructed. It is to be understood that other ways to obtain the reference brightness value matching the image type may also be used, for example, a corresponding relationship between the image type and the reference brightness value is established in advance, and the reference brightness value is a preset empirical value, and the like, and is not limited to the above-mentioned ways.
And 308, reducing the brightness value of the first image to be processed to a reference brightness value to obtain a second image to be processed.
The mobile terminal can adjust the brightness value of the first image to be processed, and reduce the brightness value of the first image to be processed to the reference brightness value matched with the type of the image to which the first image to be processed belongs to obtain a second image to be processed, wherein the second image to be processed is the image of the first image to be processed after the brightness value is reduced.
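One plausible way to reduce the image to the reference brightness value is linear scaling of the luminance; the patent does not specify the adjustment method, so this sketch and its function names are assumptions.

```python
import numpy as np

def reduce_to_reference(image, reference_mean):
    """Scale an 8-bit brightness image so its mean luminance equals
    `reference_mean` (the reference brightness value matched to the
    image type). Images already at or below the reference are untouched."""
    img = image.astype(np.float64)
    current = img.mean()
    if current <= reference_mean:
        return image.copy()
    scaled = img * (reference_mean / current)   # linear brightness reduction (assumed)
    return np.clip(scaled, 0, 255).astype(np.uint8)
```
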
In the embodiment, the image type can be determined according to the image characteristics of the first image to be processed, and the brightness value of the first image to be processed is reduced to the reference brightness value matched with the image type, so that the calculated defogging parameter can be more accurate, and the defogging effect of the image can be effectively improved.
As shown in fig. 4, in an embodiment, the step 230 of calculating the defogging parameter of the second image to be processed according to the reduced brightness value includes the following steps:
step 402, determining an atmospheric light value according to the reduced brightness value.
Step 404, the original transmittance is obtained according to the atmospheric light value.
After reducing the brightness value of the first image to be processed to obtain the second image to be processed, the mobile terminal can calculate the atmospheric light value of the second image to be processed and, from it, the original transmittance of the second image to be processed. In this embodiment, the original transmittance refers to the transmittance of the second image to be processed as a whole.
And 406, calculating the wave band transmittances corresponding to the three RGB wave bands in the second image to be processed according to the original transmittances.
Because fog affects the three RGB bands differently, applying the same degree of defogging to all three bands may leave fog in the green and blue bands incompletely removed, causing the defogged image to appear bluish and color-distorted. For the three RGB bands, corresponding adjustment coefficients can therefore be introduced, and the band transmittances corresponding to the three RGB bands in the second image to be processed recalculated as t(r), t(g), and t(b) according to these coefficients.
In one embodiment, the step 240 performs the defogging process on the first image to be processed according to the defogging parameters, including: and carrying out defogging treatment on the RGB three wave bands of the first image to be treated according to the wave band transmissivity respectively corresponding to the RGB three wave bands.
For fog of the same concentration, the influence on the R, G, and B bands increases in that order. Accordingly, among the band transmittances of the second image to be processed, the transmittance t(R) of the R band is greater than t(G) of the G band, and t(G) is greater than t(B) of the B band; different band transmittances correspond to different defogging strengths. The mobile terminal can thus apply defogging of different degrees to the three RGB bands of the first image to be processed according to the band transmittances t(R), t(G), and t(B). Substituting these into formula (1) channel by channel yields the values J(R), J(G), and J(B) of the fog-free image for the three RGB channels, where the defogging strength increases from R to G to B, i.e., the R band is defogged less strongly than the G band, and the G band less strongly than the B band. After defogging the three RGB bands of the first image to be processed, the mobile terminal can synthesize the values J(R), J(G), and J(B) of the three channels to obtain the fog-free image J(x).
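Channel-wise application of formula (1) with per-band transmittances might look like the following illustrative sketch; `A` is assumed to be a per-channel atmospheric light vector, and the `t_min` floor is a common safeguard not stated in the patent.

```python
import numpy as np

def defog_per_band(I, A, t_r, t_g, t_b, t_min=0.1):
    """Invert formula (1) separately for the R, G, and B channels using
    the band transmittances t(r), t(g), t(b), then recombine the three
    channel results into one fog-free image."""
    out = np.empty_like(I, dtype=np.float64)
    for c, t in enumerate((t_r, t_g, t_b)):
        tc = np.maximum(t, t_min)
        out[..., c] = (I[..., c] - A[c]) / tc + A[c]
    return np.clip(out, 0.0, 1.0)
```
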
In the embodiment, the defogging processing with different intensities can be performed on the three RGB bands of the first image to be processed, so that the fog in the image can be effectively removed, and meanwhile, the problems of bluish image and color distortion after defogging by using the traditional defogging algorithm can be solved, so that the color of the defogged image is more natural and real.
As shown in fig. 5, in one embodiment, the step 406 of calculating the transmittance of the wavelength bands corresponding to the three RGB wavelength bands in the second image to be processed according to the original transmittance includes the following steps:
step 502, obtaining adjustment coefficients corresponding to the three RGB bands in the second image to be processed.
The mobile terminal can obtain preset adjusting coefficients corresponding to RGB three wave bands in the second image to be processed respectively, wherein the adjusting coefficient of the R wave band is larger than that of the G wave band, and the adjusting coefficient of the G wave band is larger than that of the B wave bandAnd (4) counting. In one embodiment, the adjustment factor W of the R bandrAdjustment coefficient W of 1, G wave bandgAnd the adjustment coefficient W of B wave bandbCan be calculated according to the formula (5) and the formula (6):
Wg = (0.9 + 0.1 * t)^2    (5);
Wb = (0.7 + 0.3 * t)^2    (6);
where t denotes the original transmittance of the second image to be processed.
Step 504, calculating the band transmittances respectively corresponding to the three RGB bands according to the original transmittance and the adjustment coefficients.
The mobile terminal can multiply the adjustment coefficients respectively corresponding to the three RGB bands by the original transmittance to obtain the corresponding band transmittances, as shown in formula (7):
t(R) = Wr * t
t(G) = Wg * t
t(B) = Wb * t    (7).
It should be understood that the adjustment coefficients of the three RGB bands are not limited to the calculations of formulas (5) and (6), and the band transmittances are not limited to the calculation of formula (7); they may also be calculated in other manners.
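Formulas (5)-(7) can be sketched directly in code. Wr = 1 follows from the text above; for any original transmittance t in (0, 1), the resulting ordering t(R) > t(G) > t(B) matches the progressively stronger defogging of the G and B bands. The function name is illustrative.

```python
def band_transmittances(t):
    """Compute t(R), t(G), t(B) from the original transmittance t.

    Per formulas (5)-(7): Wr = 1, Wg = (0.9 + 0.1*t)^2,
    Wb = (0.7 + 0.3*t)^2, and t(band) = W * t.
    """
    w_r = 1.0
    w_g = (0.9 + 0.1 * t) ** 2   # formula (5)
    w_b = (0.7 + 0.3 * t) ** 2   # formula (6)
    return w_r * t, w_g * t, w_b * t   # formula (7)
```

Because 0.9 + 0.1*t and 0.7 + 0.3*t are both below 1 for t < 1, the squared coefficients shrink the G and B transmittances relative to R, which is what drives the stronger defogging of the shorter-wavelength bands.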
In this embodiment, adjustment coefficients corresponding to the three RGB bands are introduced, the band transmittances respectively corresponding to the three RGB bands in the second image to be processed are calculated from those coefficients, and defogging of different intensities is then performed on the three RGB bands of the first image to be processed according to the calculated band transmittances. This removes the fog in the image effectively while avoiding the bluish cast and color distortion produced by conventional defogging algorithms, so that the color of the defogged image is more natural and realistic.
As shown in fig. 6, in one embodiment, an image processing apparatus 600 is provided, which includes an exposure determining module 610, a reducing module 620, a calculating module 630, and a defogging module 640.
An exposure determining module 610, configured to determine an exposure of the first image to be processed.
A reducing module 620, configured to reduce the brightness value of the first image to be processed to obtain a second image to be processed when the exposure is greater than a preset threshold.
A calculating module 630, configured to calculate the defogging parameters of the second image to be processed according to the reduced brightness value.
A defogging module 640, configured to perform defogging processing on the first image to be processed according to the defogging parameters.
With this image processing apparatus, when the exposure of the first image to be processed is greater than the preset threshold, the brightness value of the first image to be processed is reduced to obtain a second image to be processed, the defogging parameters of the second image to be processed are calculated according to the reduced brightness value, and the defogging parameters are then used to defog the first image to be processed. An image with high exposure can thus be defogged, improving the defogging effect.
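The four modules above can be read as one pipeline. The sketch below wires them together under loose assumptions: mean luma as the exposure metric, a fixed threshold and reference value, and crude stand-ins for the atmospheric light value and transmittance, none of which are specified at this level of the patent.

```python
import numpy as np

def estimate_exposure(image):
    # Mean luma as a crude exposure metric (assumption, not from the patent).
    return float((image @ np.array([0.299, 0.587, 0.114])).mean())

def process_image(image, threshold=0.6, reference=0.45, t=0.8, t0=0.1):
    """Illustrative pipeline over modules 610-640.

    image : float array (H, W, 3) in [0, 1]
    """
    exposure = estimate_exposure(image)                  # module 610
    if exposure > threshold:                             # module 620
        second = np.clip(image * (reference / exposure), 0.0, 1.0)
    else:
        second = image
    A = float(second.max())                              # module 630: crude
    t_clamped = max(t, t0)                               # stand-in parameters
    J = (image - A) / t_clamped + A                      # module 640
    return np.clip(J, 0.0, 1.0)
```

Note that the defogging parameters are estimated on the darkened second image, while the defogging itself is applied to the original first image, mirroring the split of responsibilities between the calculating module 630 and the defogging module 640.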
As shown in fig. 7, in one embodiment, the reduction module 620 includes an extraction unit 622, an analysis unit 624, a reference brightness value obtaining unit 626, and a reduction unit 628.
The extracting unit 622 is configured to extract image features of the first image to be processed.
The analyzing unit 624 is configured to analyze the image features through a pre-established image analysis model to determine an image type of the first image to be processed.
A reference brightness value obtaining unit 626, configured to obtain a reference brightness value matching the image type.
The reducing unit 628 is configured to reduce the luminance value of the first image to be processed to a reference luminance value, so as to obtain a second image to be processed.
In this embodiment, the image type is determined from the image features of the first image to be processed, and the brightness value of the first image to be processed is reduced to the reference brightness value matching that type, making the calculated defogging parameters more accurate and effectively improving the defogging effect.
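A minimal sketch of the reduction step described above. The patent does not specify how the brightness value is lowered or what the reference values are, so the Rec. 601 luma weights, the per-type reference table, and the uniform scaling below are all assumptions.

```python
import numpy as np

# Hypothetical reference brightness values per image type; the patent
# leaves the concrete numbers unspecified, so these are placeholders.
REFERENCE_LUMA = {"landscape": 0.45, "portrait": 0.50, "night": 0.35}

def reduce_to_reference(image, image_type):
    """Scale the image so its mean luma matches the type's reference value.

    image : float array (H, W, 3) in [0, 1]
    Uses Rec. 601 luma weights; the patent only says the brightness
    value is reduced to the reference value, not how.
    """
    luma = image @ np.array([0.299, 0.587, 0.114])
    mean = luma.mean()
    ref = REFERENCE_LUMA[image_type]
    if mean <= ref:            # already at or below the reference
        return image
    return np.clip(image * (ref / mean), 0.0, 1.0)
```

The output of this step is the "second image to be processed" on which the defogging parameters are then computed.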
As shown in fig. 8, in one embodiment, the calculating module 630 includes an atmospheric light value determining unit 632, an obtaining unit 634, and a band transmittance calculating unit 636.
An atmospheric light value determining unit 632, configured to determine an atmospheric light value according to the reduced brightness value.
An obtaining unit 634, configured to obtain the original transmittance according to the atmospheric light value.
A band transmittance calculating unit 636, configured to calculate the band transmittances respectively corresponding to the three RGB bands in the second image to be processed according to the original transmittance.
In one embodiment, the defogging module 640 is further configured to perform defogging on the three RGB bands of the first image to be processed according to the band transmittances respectively corresponding to the three RGB bands.
In this embodiment, defogging of different intensities can be performed on the three RGB bands of the first image to be processed, which removes the fog in the image effectively while avoiding the bluish cast and color distortion produced by conventional defogging algorithms, so that the color of the defogged image is more natural and realistic.
In one embodiment, the band transmittance calculating unit 636 includes an adjustment coefficient obtaining subunit and a calculating subunit.
An adjustment coefficient obtaining subunit, configured to obtain adjustment coefficients respectively corresponding to the three RGB bands in the second image to be processed.
A calculating subunit, configured to calculate the band transmittances respectively corresponding to the three RGB bands according to the original transmittance and the adjustment coefficients.
In this embodiment, adjustment coefficients corresponding to the three RGB bands are introduced, the band transmittances respectively corresponding to the three RGB bands in the second image to be processed are calculated from those coefficients, and defogging of different intensities is then performed on the three RGB bands of the first image to be processed according to the calculated band transmittances. This removes the fog in the image effectively while avoiding the bluish cast and color distortion produced by conventional defogging algorithms, so that the color of the defogged image is more natural and realistic.
The embodiment of the invention also provides a mobile terminal. The mobile terminal includes an image processing circuit, which may be implemented using hardware and/or software components and may include various processing units defining an ISP (Image Signal Processing) pipeline. FIG. 9 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 9, for convenience of explanation, only the aspects of the image processing technique related to the embodiment of the invention are shown.
As shown in fig. 9, the image processing circuit includes an ISP processor 940 and control logic 950. Image data captured by the imaging device 910 is first processed by the ISP processor 940, which analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of the imaging device 910. The imaging device 910 may include a camera having one or more lenses 912 and an image sensor 914. The image sensor 914 may include an array of color filters (e.g., Bayer filters), may acquire the light intensity and wavelength information captured by each of its imaging pixels, and may provide a set of raw image data that can be processed by the ISP processor 940. The sensor 920 may provide the raw image data to the ISP processor 940 based on the sensor 920 interface type, which may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of the above.
The ISP processor 940 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 940 may perform one or more image processing operations on the raw image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit depth precision.
The ISP processor 940 may also receive pixel data from the image memory 930. For example, raw pixel data may be sent from the sensor 920 interface to the image memory 930, and the raw pixel data in the image memory 930 is then provided to the ISP processor 940 for processing. The image memory 930 may be part of a memory device, a storage device, or a separate dedicated memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from the sensor 920 interface or from the image memory 930, the ISP processor 940 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to the image memory 930 for additional processing before being displayed. The ISP processor 940 may also receive processed data from the image memory 930 for image data processing in the raw domain and in the RGB and YCbCr color spaces. The processed image data may be output to a display 980 for viewing by a user and/or further processing by a graphics engine or GPU (Graphics Processing Unit). Further, the output of the ISP processor 940 may also be sent to the image memory 930, and the display 980 may read image data from the image memory 930. In one embodiment, the image memory 930 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 940 may be transmitted to an encoder/decoder 970 to encode/decode the image data. The encoded image data may be saved and decompressed before being displayed on the display 980.
The processing of the image data by the ISP processor 940 includes VFE (Video Front End) processing and CPP (Camera Post Processing). VFE processing of the image data may include modifying the contrast or brightness of the image data, modifying digitally recorded lighting status data, performing compensation processing on the image data (e.g., white balance, automatic gain control, gamma correction, etc.), performing filtering on the image data, and so on. CPP processing of the image data may include scaling the image and providing a preview frame and a record frame to each path, where the CPP may use different codecs to process the preview and record frames.
The image data processed by the ISP processor 940 may be transmitted to a defogging module 960 to defog the image before it is displayed. The defogging module 960 can reduce the brightness value of the first image to be processed to obtain a second image to be processed, calculate the defogging parameters of the second image to be processed according to the reduced brightness value, and perform defogging processing on the first image to be processed according to the calculated defogging parameters. The defogging module 960 may be a CPU (Central Processing Unit), a GPU, a coprocessor, or the like. After the defogging module 960 defogs the image data, the defogged image data may be transmitted to the encoder/decoder 970 to encode/decode the image data. The encoded image data may be saved and decompressed before being displayed on the display 980. It is understood that the image data processed by the defogging module 960 may also be sent directly to the display 980 for display without passing through the encoder/decoder 970, and that the image data processed by the ISP processor 940 may be processed by the encoder/decoder 970 first and then by the defogging module 960. The encoder/decoder may be a CPU, a GPU, a coprocessor, or the like in the mobile terminal.
The statistical data determined by the ISP processor 940 may be transmitted to the control logic 950 unit. For example, the statistical data may include image sensor 914 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 912 shading correction, and the like. The control logic 950 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of the imaging device 910 and control parameters of the ISP processor 940 based on the received statistical data. For example, the control parameters may include sensor 920 control parameters (e.g., gain, integration time for exposure control), camera flash control parameters, lens 912 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 912 shading correction parameters.
In the present embodiment, the image processing method described above can be realized by using the image processing technique in fig. 9.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the above-mentioned image processing method.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. The storage medium may be a magnetic disk, an optical disk, a Read-only memory (ROM), or the like.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments express only several embodiments of the present invention, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, all of which fall within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (12)

1. An image processing method, comprising:
determining the exposure of the first image to be processed;
when the exposure is larger than a preset threshold, reducing the brightness value of the first image to be processed to obtain a second image to be processed;
calculating the defogging parameters of the second image to be processed according to the reduced brightness value; the defogging parameters comprise an atmospheric light value and a transmittance;
and carrying out defogging treatment on the first image to be treated according to the defogging parameters.
2. The method according to claim 1, wherein the reducing the brightness value of the first image to be processed to obtain a second image to be processed comprises:
extracting image features of the first image to be processed;
analyzing the image characteristics through a pre-established image analysis model to determine the image type of the first image to be processed;
acquiring a reference brightness value matched with the image type;
and reducing the brightness value of the first image to be processed to the reference brightness value to obtain a second image to be processed.
3. The method according to claim 1, wherein said calculating the defogging parameters of the second image to be processed according to the reduced brightness value comprises:
determining an atmospheric light value according to the reduced brightness value;
calculating the original transmittance according to the atmospheric light value;
and calculating the band transmittances respectively corresponding to the three RGB bands in the second image to be processed according to the original transmittance.
4. The method according to claim 3, wherein said calculating the band transmittances respectively corresponding to the three RGB bands in the second image to be processed according to the original transmittance comprises:
acquiring adjustment coefficients respectively corresponding to the three RGB bands in the second image to be processed;
and calculating the band transmittances respectively corresponding to the three RGB bands according to the original transmittance and the adjustment coefficients.
5. The method according to claim 3 or 4, wherein said performing defogging processing on the first image to be processed according to the defogging parameters comprises:
performing defogging processing on the three RGB bands of the first image to be processed according to the band transmittances respectively corresponding to the three RGB bands.
6. An image processing apparatus characterized by comprising:
an exposure determining module for determining the exposure of the first image to be processed;
a reducing module, used for reducing the brightness value of the first image to be processed to obtain a second image to be processed when the exposure is greater than a preset threshold;
a calculating module, used for calculating the defogging parameters of the second image to be processed according to the reduced brightness value; the defogging parameters comprise an atmospheric light value and a transmittance;
and a defogging module, used for performing defogging processing on the first image to be processed according to the defogging parameters.
7. The apparatus of claim 6, wherein the lowering module comprises:
the extraction unit is used for extracting the image characteristics of the first image to be processed;
the analysis unit is used for analyzing the image characteristics through a pre-established image analysis model and determining the image type of the first image to be processed;
a reference luminance value acquisition unit configured to acquire a reference luminance value matching the image type;
and the reducing unit is used for reducing the brightness value of the first image to be processed to the reference brightness value to obtain a second image to be processed.
8. The apparatus of claim 6, wherein the computing module comprises:
an atmospheric light value determination unit for determining an atmospheric light value based on the reduced brightness value;
a calculating unit, used for calculating the original transmittance according to the atmospheric light value;
and a band transmittance calculating unit, used for calculating the band transmittances respectively corresponding to the three RGB bands in the second image to be processed according to the original transmittance.
9. The apparatus of claim 8, wherein the band transmittance calculation unit comprises:
an adjustment coefficient obtaining subunit, configured to obtain adjustment coefficients corresponding to the three RGB bands in the second image to be processed, respectively;
and a calculating subunit, used for calculating the band transmittances respectively corresponding to the three RGB bands according to the original transmittance and the adjustment coefficients.
10. The apparatus according to claim 8 or 9, wherein
the defogging module is further used for performing defogging processing on the three RGB bands of the first image to be processed according to the band transmittances respectively corresponding to the three RGB bands.
11. A mobile terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor when executing the program implementing the method according to any of claims 1 to 5.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1 to 5.
CN201710625428.9A 2017-07-27 2017-07-27 Image processing method, image processing device, mobile terminal and computer readable storage medium Active CN107392870B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710625428.9A CN107392870B (en) 2017-07-27 2017-07-27 Image processing method, image processing device, mobile terminal and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN107392870A CN107392870A (en) 2017-11-24
CN107392870B true CN107392870B (en) 2020-07-21

Family

ID=60342777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710625428.9A Active CN107392870B (en) 2017-07-27 2017-07-27 Image processing method, image processing device, mobile terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN107392870B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108171667A (en) * 2017-12-29 2018-06-15 努比亚技术有限公司 A kind of image processing method, terminal and computer readable storage medium
EP3871406B1 (en) 2018-12-29 2024-03-06 Zhejiang Dahua Technology Co., Ltd. Systems and methods for exposure control
CN112598586B (en) * 2020-12-11 2022-11-11 青岛海信移动通信技术股份有限公司 Foggy day image display method and terminal

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104134194A (en) * 2014-07-23 2014-11-05 中国科学院深圳先进技术研究院 Image defogging method and image defogging system
CN105023256A (en) * 2015-08-13 2015-11-04 丘璇 Image defogging method and system
CN105447825A (en) * 2015-10-08 2016-03-30 湖北大学 Image defogging method and system
CN105631823A (en) * 2015-12-28 2016-06-01 西安电子科技大学 Dark channel sky area defogging method based on threshold segmentation optimization
CN106023118A (en) * 2016-06-13 2016-10-12 凌云光技术集团有限责任公司 Image defogging method and realization method on FPGA


Also Published As

Publication number Publication date
CN107392870A (en) 2017-11-24

Similar Documents

Publication Publication Date Title
CN107424198B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107451969B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107317967B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN110473185B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108419028B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN107481186B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN107493432B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107395991B (en) Image synthesis method, image synthesis device, computer-readable storage medium and computer equipment
CN107424133B (en) Image defogging method and device, computer storage medium and mobile terminal
CN107563976B (en) Beauty parameter obtaining method and device, readable storage medium and computer equipment
CN107993209B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN107368806B (en) Image rectification method, image rectification device, computer-readable storage medium and computer equipment
CN107862658B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN109242794B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107704798B (en) Image blurring method and device, computer readable storage medium and computer device
CN107392870B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107945106B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN108259754B (en) Image processing method and device, computer readable storage medium and computer device
CN107454317B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN107454318B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN108737797B (en) White balance processing method and device and electronic equipment
CN107194901B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
CN107578372B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN109191398B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN107424134B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant