CN111405177A - Image processing method, terminal and computer readable storage medium - Google Patents


Info

Publication number
CN111405177A
Authority
CN
China
Prior art keywords
brightness
brightness value
preview image
processing method
image processing
Prior art date
Legal status
Granted
Application number
CN202010157432.9A
Other languages
Chinese (zh)
Other versions
CN111405177B (en)
Inventor
李逸超
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010157432.9A priority Critical patent/CN111405177B/en
Publication of CN111405177A publication Critical patent/CN111405177A/en
Application granted granted Critical
Publication of CN111405177B publication Critical patent/CN111405177B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/634Warning indications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof

Abstract

The application discloses an image processing method. The image processing method of the embodiment of the application comprises the following steps: acquiring preview images of a plurality of different scenes; detecting low-brightness areas which are lower in brightness value than other areas and are distributed continuously in each preview image; and when the coincidence degree between the low-brightness areas of the preview images is higher than a preset threshold value, generating a prompt message that the lens is dirty. The application also provides a terminal and a computer readable storage medium. By detecting the low-brightness areas in the plurality of preview images, if the coincidence degree of the low-brightness areas in the plurality of preview images is higher than a preset threshold value, it is determined that the low-brightness areas are formed because dirt blocks part of the light, and at this time, a prompt message that the lens is dirty is generated to notify the user.

Description

Image processing method, terminal and computer readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, a terminal, and a computer-readable storage medium.
Background
When a camera is used to take pictures, the device can usually detect whether the camera is damaged and unusable for imaging, and whether the camera is blocked and unable to receive light. Sometimes, however, there is dirt on the camera: the camera can still receive light, but the dirt degrades the imaging quality of the camera, and the user may not be aware of this.
Disclosure of Invention
The embodiment of the application provides an image processing method, a terminal and a computer readable storage medium.
The image processing method of the embodiment of the application comprises the following steps: acquiring preview images of a plurality of different scenes; detecting low-brightness areas which are lower in brightness value than other areas and are distributed continuously in each preview image; and when the coincidence degree of the low-brightness areas of the preview images is higher than a preset threshold value, generating a prompt message that the lens is dirty.
The terminal of the embodiment of the application comprises a lens and a processor, wherein the processor is used for: acquiring preview images of a plurality of different scenes; detecting low-brightness areas which are lower in brightness value than other areas and are distributed continuously in each preview image; and when the coincidence degree of the low-brightness areas of the preview images is higher than a preset threshold value, generating a prompt message that the lens is dirty.
The non-transitory computer-readable storage medium of embodiments of the present application containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform: acquiring preview images of a plurality of different scenes; detecting low-brightness areas which are lower in brightness value than other areas and are distributed continuously in each preview image; and when the coincidence degree of the low-brightness areas of the preview images is higher than a preset threshold value, generating a prompt message that the lens is dirty.
In the image processing method, the terminal and the computer-readable storage medium according to the embodiments of the present application, by detecting a low-brightness region in a plurality of preview images, if the coincidence degree of the low-brightness regions in the plurality of preview images is higher than a preset threshold, it is determined that the low-brightness regions are formed because dirt blocks part of the light, and at this time, a prompt message indicating that the lens is dirty is generated to notify the user.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a terminal according to an embodiment of the present application;
FIGS. 3a, 3b, and 3c are schematic diagrams of a preview image before processing according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of an image processing method according to an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating the processing effect of the preview image according to the embodiment of the present application;
FIG. 6 is a schematic flow chart of an image processing method according to an embodiment of the present application;
FIG. 7 is a schematic flow chart of an image processing method according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a preview image before processing in accordance with an embodiment of the present application;
FIG. 9 is a schematic flow chart of an image processing method according to an embodiment of the present application;
FIG. 10 is a schematic diagram illustrating the processing effect of the preview image according to the embodiment of the present application;
fig. 11 is a schematic diagram of a connection between a computer-readable storage medium and a processor according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
Referring to fig. 1 and fig. 2, an image processing method according to an embodiment of the present disclosure includes:
01: acquiring preview images of a plurality of different scenes;
02: detecting low-brightness areas which are lower in brightness value than other areas and are distributed continuously in each preview image; and
03: when the coincidence degree between the low-brightness regions of the plurality of preview images is higher than a preset threshold value, a notification that the lens 10 is dirty is generated.
The image processing method according to the embodiment of the present application can be applied to the terminal 100 according to the embodiment of the present application, where the terminal 100 includes a lens 10 and a processor 20, and the processor 20 can be configured to implement steps 01, 02, and 03, that is, the processor 20 can be configured to obtain preview images of a plurality of different scenes; detecting low-brightness areas which are lower in brightness value than other areas and are distributed continuously in each preview image; when the coincidence degree between the low-brightness regions of the plurality of preview images is higher than a preset threshold value, a notification that the lens 10 is dirty is generated.
Specifically, the terminal 100 may be a mobile phone, a tablet computer, a single-lens reflex camera, a laptop computer, a smart watch, smart glasses, a smart headset, or another terminal; the terminal 100 shown in fig. 2 is taken as an example for illustration, and it is understood that the specific form of the terminal 100 is not limited to a mobile phone. The lens 10 may be any device on the terminal 100 that receives light to perform imaging; for example, the lens 10 may be a front camera, a rear camera, a side camera, a screen camera, etc., which is not limited herein. The processor 20 may be, for example, an application processor or an image processor of the terminal 100.
In step 01, preview images of a plurality of different scenes are acquired. A preview image may be a picture obtained by the lens 10 for the user to preview before the user confirms shooting, or a picture obtained by shooting after the user confirms shooting. The plurality may be any number greater than one, such as two, three, four, or five. Different scenes mean that the pictures in the preview images are different, including different picture contents, different picture angles, and the like. It is to be understood that the plurality of preview images may be preview images obtained at different times while the lens 10 is opened once, or may be preview images obtained separately when the lens 10 is opened multiple times. In the example shown in fig. 3a, 3b, and 3c, the plurality of preview images are preview image P1, preview image P2, and preview image P3, respectively, and the scenes of the plurality of preview images are different.
In step 02, low-luminance regions in each preview image, which have lower luminance values than other regions and are continuously distributed, are detected. The preview image may be an image stored in a YUV format, each pixel in the preview image has information of a brightness value, and the processor 20 may obtain positions and brightness values of all pixels in each preview image, and further determine a distribution condition of a low-brightness region. Specifically, the low-luminance region refers to a region having a lower luminance value than other regions in the same preview image, and the low-luminance region is also a continuous region, that is, each pixel in the low-luminance region is at least adjacent to another pixel in the same low-luminance region. In the example shown in fig. 3a, 3b and 3c, the low-luminance region in the preview image P1 is D1, the low-luminance region in the preview image P2 is D2 and the low-luminance region in the preview image P3 is D3.
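As an illustration of step 02, the following Python sketch finds a contiguous low-brightness region in the luma (Y) plane of a YUV preview image. The thresholding rule (mean luma minus a fixed margin) and the use of connected-component labelling are assumptions made for the sketch; the embodiment only requires that the region be darker than the other regions and continuously distributed.

```python
# Step 02: find a contiguous low-brightness region in the luma (Y) plane.
# The "mean minus margin" threshold and the connected-component labelling
# are illustrative assumptions.
import numpy as np
from scipy import ndimage

def detect_low_brightness_region(y_channel, margin=30.0):
    """Return a boolean mask of the largest connected region that is
    noticeably darker than the rest of the preview image."""
    dark = y_channel < (y_channel.mean() - margin)        # darker than the other areas
    labels, count = ndimage.label(dark)                    # contiguous (connected) regions
    if count == 0:
        return np.zeros_like(dark, dtype=bool)
    sizes = ndimage.sum(dark, labels, index=range(1, count + 1))
    largest = int(np.argmax(sizes)) + 1
    return labels == largest
```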
In step 03, when the coincidence degree of the low-luminance regions of the plurality of preview images is higher than a preset threshold value, a prompt message that the lens 10 is dirty is generated. When there is dirt on the lens 10, the dirt blocks part of the light entering the lens 10, so that a region of the image formed by the lens 10 is darker than the remaining regions. Because the position of the dirt usually does not change within a short time, this darker region is likely to be present when different scenes are shot, and its position remains roughly unchanged. The dirt may be grease, particles, dust, water stains, oil stains, and the like, and it may exist on the lens group of the lens 10, or on a cover plate or screen located in the optical path of the lens 10. If the coincidence degree of the low-brightness regions of the preview images is higher than the preset threshold value, the low-brightness regions in the preview images are considered likely to be caused by dirt on the lens 10, and at this time a prompt message that the lens 10 is dirty is generated, so that the user can conveniently deal with the dirt. The coincidence degree may be a ratio of the area where the low-luminance regions of the different preview images overlap to the area of the low-luminance region; when the ratio is large, the coincidence degree is high, and when the ratio is small, the coincidence degree is low. The preset threshold may be set when the terminal 100 leaves the factory, or may be adjusted by the user; for example, the preset threshold may be a value such as 0.5, 0.6, or 0.8, which is not limited herein. After generating the prompt message that the lens 10 is dirty, the terminal 100 may respond to the prompt message; specifically, the prompt message may be displayed on a display screen of the terminal 100, or announced through a speaker of the terminal 100, or the terminal 100 may emit a specific vibration to prompt the user, which is not limited herein.
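A minimal sketch of the coincidence-degree check of step 03, assuming the coincidence degree is computed as the intersection area of the low-brightness masks divided by their mean area, with 0.6 as the preset threshold; both choices are illustrative, since the embodiment only requires a coincidence degree above a preset threshold.

```python
# Step 03: coincidence degree across the low-brightness masks of several preview images.
# Intersection area over mean mask area, and the 0.6 threshold, are assumptions.
import numpy as np

def lens_dirty(masks, threshold=0.6):
    """masks: one boolean low-brightness mask per preview image (same shape)."""
    mean_area = np.mean([m.sum() for m in masks])
    if mean_area == 0:
        return False
    intersection = np.logical_and.reduce(masks).sum()      # area where all masks overlap
    coincidence = intersection / mean_area
    return coincidence > threshold                         # -> generate the dirty-lens prompt

# masks = [detect_low_brightness_region(p) for p in preview_luma_planes]
# if lens_dirty(masks): ...  # show the prompt on screen, via speaker, or by vibration
```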
In summary, in the image processing method and the terminal 100 according to the embodiments of the present application, by detecting the low-luminance regions in the plurality of preview images, if the coincidence degree of the low-luminance regions in the plurality of preview images is higher than the preset threshold, it is determined that the low-luminance regions are formed because dirt blocks the light, and at this time, a prompt message indicating that the lens 10 is dirty is generated to notify the user.
In some embodiments, before implementing step 02, the image processing method further comprises the step of: receiving a user instruction to set whether to respond to the prompt message that the lens 10 is dirty. The user can input an instruction to set that the prompt message needs to be responded to; in that case, after the prompt message that the lens 10 is dirty is generated in step 03, the terminal 100 responds to the prompt message. In one example, the terminal 100 is set by default to respond to the prompt message that the lens 10 is dirty. The user may also input an instruction to set that the prompt message does not need to be responded to. For example, in some cases, the user may intentionally apply pigment or other foreign matter to the lens 10 to shoot an image with a personalized or special effect; the user may first set the prompt message to be ignored, so as to avoid being disturbed by the dirty-lens prompt when subsequently using the terminal 100, thereby improving the user experience.
Referring to fig. 4, in some embodiments, the image processing method further includes:
04: dividing the preview image into a plurality of brightness areas according to a plurality of preset brightness value intervals, wherein each brightness value interval is associated with a corresponding adjustment proportion; and
05: and adjusting the brightness value of the corresponding brightness area according to the adjustment ratio.
Referring to fig. 2, in some embodiments, the processor 20 may be further configured to perform step 04 and step 05, that is, the processor 20 may be configured to divide the preview image into a plurality of luminance zones according to a plurality of preset luminance value intervals, where each of the luminance value intervals is associated with a corresponding adjustment ratio; and adjusting the brightness value of the corresponding brightness area according to the adjustment proportion.
When the lens 10 is dirty, the influence of the dirt on the preview image may not be limited to the low-brightness area: the lens 10 may also automatically adjust parameters such as sensitivity and aperture because of the dirt, so that the overall brightness of the preview image looks poor. In addition, the low-brightness area may stand out because of its obviously low brightness, which affects the appearance of the preview image. By processing the preview image through step 04 and step 05, the brightness of the preview image can be readjusted to restore, as much as possible, the shooting effect obtained without the dirt.
Specifically, in step 04, the preview image is divided into a plurality of brightness areas according to a plurality of preset brightness value intervals. For different preview images, the brightness value intervals may be different, and the number of the divided brightness areas may also be different. In one example, three brightness value intervals may be set according to the distribution range of the brightness values of all pixels of the preview image, and the preview image is divided into three brightness areas accordingly. For example, if the distribution range of the brightness values of all pixels of the preview image is [80, 230], the three brightness value intervals may be [80, 130), [130, 180), and [180, 230], and the three divided brightness areas may be a low-brightness area, a medium-brightness area, and a high-brightness area, respectively. Of course, each brightness area may be a complete and continuous region, or may include a plurality of discrete, spaced regions, which is not limited herein. The interval width of each brightness value interval may be the same or different, and the number of brightness value intervals and brightness areas is not limited to the three discussed above and may be any number greater than or equal to two.
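A short sketch of the zone division of step 04; the three half-open intervals mirror the [80, 130), [130, 180), [180, 230] example above, and the synthetic luma plane is only a stand-in for a real preview image.

```python
# Step 04: one boolean mask per preset brightness-value interval.
# The intervals and the synthetic luma plane below are illustrative assumptions.
import numpy as np

def split_into_zones(y_channel, intervals):
    """intervals: list of (low, high) half-open brightness-value intervals.
    A zone may consist of several discrete, spaced sub-regions."""
    return [(y_channel >= lo) & (y_channel < hi) for lo, hi in intervals]

y = np.random.randint(80, 231, size=(480, 640))                    # stand-in luma plane in [80, 230]
zones = split_into_zones(y, [(80, 130), (130, 180), (180, 231)])   # low / medium / high
```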
In step 05, the brightness values of the corresponding brightness areas are adjusted according to the adjustment ratios, where the adjustment ratios associated with the different brightness areas may be partially the same or entirely different. The brightness value of each pixel after adjustment may be expressed by the formula: adjusted brightness value = brightness value before adjustment × (1 + adjustment ratio). By making the adjustment ratios of the different brightness areas different, the influence of the dirt on the different brightness areas can be compensated. In one example, the adjustment ratio of the low-brightness area may be a number greater than zero or a negative number close to zero, the adjustment ratio of the medium-brightness area may be zero or close to zero, and the adjustment ratio of the high-brightness area may be a number less than zero, so as to process the brightness values of the pixels of the different brightness areas differently; for example, the adjustment ratios of the low-brightness area, the medium-brightness area, and the high-brightness area may be 0.1, 0, and -0.1, respectively. In this way, the image quality of the preview image obtained after the processing is significantly improved compared with the preview image before adjustment; fig. 5 schematically illustrates the processing effect.
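The per-zone adjustment of step 05 can be sketched as follows, using the formula adjusted value = value before adjustment × (1 + adjustment ratio); the ratios 0.1, 0, and -0.1 are taken from the example above and are not mandated by the method.

```python
# Step 05: adjusted value = value before adjustment × (1 + adjustment ratio).
# The ratios 0.1 / 0 / -0.1 follow the example in the text and are illustrative.
import numpy as np

def adjust_zones(y_channel, zone_masks, ratios):
    out = y_channel.astype(np.float32)
    for mask, ratio in zip(zone_masks, ratios):
        out[mask] = out[mask] * (1.0 + ratio)      # per-zone brightness correction
    return np.clip(out, 0, 255).astype(np.uint8)

# e.g. brighten the low zone, keep the medium zone, darken the high zone:
# adjusted = adjust_zones(y, zones, ratios=[0.1, 0.0, -0.1])
```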
In another embodiment, the area of the preview image other than the low-brightness area may be divided into a plurality of adjustment areas according to the distance between the pixels in the preview image and the low-brightness area; the low-brightness area and the plurality of adjustment areas are respectively associated with different adjustment ratios, and the brightness values of the pixels in the different areas of the preview image are adjusted with these different ratios. Since the low-brightness area is affected most by the dirt, and among the other areas those closer to the low-brightness area are affected more, with more serious distortion of their brightness values, the other areas may be divided according to their distance from the low-brightness area and assigned different adjustment ratios, so as to restore, as much as possible, the brightness distribution the preview image would have without the dirt.
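A sketch of this distance-based variant, assuming the area outside the low-brightness region is split into distance bands of a fixed width and the adjustment ratio decays linearly with distance; the band width, the number of bands, and the decay rule are all assumptions for illustration.

```python
# Distance-based variant: the adjustment ratio is largest inside the
# low-brightness area and decays in the surrounding distance bands.
# Band width (40 px), number of bands (3) and the linear decay are assumptions.
import numpy as np
from scipy import ndimage

def distance_based_ratio_map(low_mask, base_ratio=0.1, band_px=40, n_bands=3):
    dist = ndimage.distance_transform_edt(~low_mask)    # distance to the low-brightness area
    ratio_map = np.zeros(low_mask.shape, dtype=np.float32)
    ratio_map[low_mask] = base_ratio
    for i in range(n_bands):
        band = (dist > i * band_px) & (dist <= (i + 1) * band_px)
        ratio_map[band] = base_ratio * (n_bands - 1 - i) / n_bands   # weaker farther away
    return ratio_map

# adjusted = np.clip(y * (1.0 + distance_based_ratio_map(low_mask)), 0, 255)
```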
In some embodiments, the terminal 100 may simultaneously display the preview image that has not been processed by steps 01 to 05 and the preview image that has been processed by steps 01 to 05, so that the user can conveniently compare the preview images before and after processing; the user may choose to save or edit either one of them, or both, which respects the user's preference.
Referring to fig. 6, in some embodiments, the image processing method further includes:
06: improving the contrast of the preview image; and/or
07: improving the color saturation of the preview image.
Referring to fig. 2, in some embodiments, the processor 20 may be further configured to perform step 06 and/or step 07, that is, the processor 20 may be configured to enhance the contrast of the preview image; and/or processor 20 may be used to enhance the color saturation of the preview image.
By further increasing the contrast and/or the color saturation of the preview image, defects such as an overall washed-out preview image and a loss of tonal gradation caused by the dirt on the lens 10 can be compensated, further improving the quality of the preview image. Specifically, the processor 20 may generate a first preview image processed by increasing both the contrast and the color saturation, a second preview image processed by increasing only the contrast, and a third preview image processed by increasing only the color saturation, so that the user can select which one or more of the first, second, and third preview images to save, providing more choices for the user and improving the user experience.
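Steps 06 and 07 can be sketched with simple global gains; the gain factors of 1.2 and 1.3 and the gray-anchored saturation formula are assumptions, since the embodiments do not prescribe a particular contrast or saturation algorithm.

```python
# Steps 06/07: simple global contrast and saturation gains.
# The 1.2 / 1.3 factors and the gray-anchored saturation rule are assumptions.
import numpy as np

def enhance(rgb, contrast=1.2, saturation=1.3):
    img = rgb.astype(np.float32)
    mean = img.mean()                                   # contrast: stretch around the mean
    img = (img - mean) * contrast + mean
    gray = img.mean(axis=2, keepdims=True)              # saturation: push away from gray
    img = gray + (img - gray) * saturation
    return np.clip(img, 0, 255).astype(np.uint8)

# first = enhance(rgb)                      # contrast + saturation
# second = enhance(rgb, saturation=1.0)     # contrast only
# third = enhance(rgb, contrast=1.0)        # saturation only
```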
Referring to fig. 7, in some embodiments, the image processing method further includes:
08: detecting a brightness value within a preset range of the periphery of a light source when the light source exists in the preview image; and
09: when a banded region connected with the light source exists in the preset range and the brightness value of the banded region exceeds the brightness value of the remaining regions in the preset range by more than a preset brightness threshold, a prompt message that the lens 10 is dirty is generated.
Referring to fig. 2, in some embodiments, the processor 20 may be configured to perform steps 08 and 09, that is, the processor 20 may be configured to detect a brightness value within a preset range around a light source when the light source exists in the preview image; and generate a prompt message that the lens 10 is dirty when a banded region connected with the light source exists in the preset range and the brightness value of the banded region exceeds the brightness value of the remaining regions in the preset range by more than a preset brightness threshold.
When a light source exists in the preview image, the brightness of the whole preview image may be relatively high, and it may be difficult to distinguish an obvious low-brightness region; in this case, whether the lens 10 is dirty can be determined according to the influence of the dirt on the light source in the preview image. Specifically, when there is no dirt, light passing through the lens group is usually focused on the image sensor of the lens 10. If there is dirt on the lens 10, the dirt may change the refraction angle of the light or cause the light to scatter, so that the light cannot be focused accurately on the image sensor, producing halos, flares, and the like; in particular, for scenes containing a light source, a light band may appear around the position of the light source. Steps 08 and 09 determine that the lens 10 is dirty by detecting a banded region around the light source whose brightness value is greater than that of the other regions, so that the user can learn of the dirty condition of the lens 10 in advance.
Specifically, in step 08, when a light source is present in the preview image, the brightness value within the peripheral preset range of the light source is detected. The determination of whether the light source exists in the preview image may be performed by detecting whether a small range area with extremely high brightness exists in the preview image, for example, the brightness value of the pixel in the small range area reaches the maximum, or the RGB pixel values of the pixels in the small range area are all close to 255, and the like, which is not limited herein. When the light source exists in the preview image, detecting the brightness value in a preset range around the light source, for example, detecting the brightness value of pixels in a preset number of pixel ranges around the light source with the position of the light source as the center, where the preset range may be set by a user or may be set according to the accuracy of the image that the user needs to capture currently, where the preset range may be larger when the accuracy requirement is higher, and the preset range may be smaller when the accuracy requirement is lower. Of course, when it is determined that the light source is not present in the preview image, it is possible to determine whether or not the lens 10 is dirty in steps 02 and 03.
For example, as shown in FIG. 8, in the preview image P4, the brightness value of the pixels where the light source L is located is 255, and the pixels of the banded region D4 are connected to the pixels where the light source L is located. The brightness values of the pixels in the banded region D4 are in the range of 200 to 245, while the brightness values of the pixels in the remaining regions of the preset range are in the range of 100 to 150; the difference between the brightness value of the banded region D4 and the brightness value of the remaining regions of the preset range is therefore above the preset brightness threshold, and the banded region D4 is likely to have been generated by dirt on the lens 10. Accordingly, a prompt message that the lens 10 is dirty is generated.
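A sketch of steps 08 and 09 built around the worked example above: pixels at (or near) the maximum brightness are treated as the light source, a square window around it is taken as the preset range, and a connected bright component touching the source whose mean brightness exceeds the rest of the window by a preset threshold triggers the prompt. The window size, the 250/200 brightness levels, the 50-level threshold, and the "band larger than the source" rule are all assumptions for illustration.

```python
# Steps 08/09: detect a bright band connected with a light source within a
# preset range. Levels, window size and the band/source size rule are assumptions.
import numpy as np
from scipy import ndimage

def dirty_band_near_light(y, sat_level=250, band_level=200, margin=50, half_win=60):
    light = y >= sat_level
    if not light.any():
        return False                                      # no light source: fall back to steps 02/03
    cy, cx = np.argwhere(light).mean(axis=0).astype(int)  # light-source centre
    win = y[max(cy - half_win, 0):cy + half_win, max(cx - half_win, 0):cx + half_win]
    bright = win >= band_level
    rest_mean = win[~bright].mean() if (~bright).any() else 0.0
    labels, count = ndimage.label(bright)
    for i in range(1, count + 1):
        region = labels == i
        source = region & (win >= sat_level)
        if not source.any():
            continue                                      # not connected with the light source
        band = region & ~source                           # the band proper, excluding the source
        if band.sum() > source.sum() and win[band].mean() - rest_mean > margin:
            return True                                   # -> prompt that the lens is dirty
    return False
```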
Referring to fig. 9, in some embodiments, the image processing method further includes:
010: calibrating the brightness value of the banded region according to the brightness values of the remaining regions; and/or
011: calibrating the color of the banded region according to the color of the remaining regions.
As described above, the banded region is likely to be generated by the dirt, so the information of the banded region is likely to be distorted. Since the remaining regions within the preset range are close to the banded region, calibrating the banded region according to the information of the remaining regions yields more accurate information, so that the processed preview image is more accurate.
In step 010, the brightness value of the banded region is calibrated according to the brightness values of the remaining regions. Specifically, the average of the brightness values of all pixels of the remaining regions may be calculated and substituted for the brightness values of the pixels in the banded region; alternatively, the brightness values of the pixels of the remaining regions adjacent to the banded region may be substituted for the brightness values of the pixels in the banded region; alternatively, the brightness difference between the light source and the remaining regions may be calculated, and the brightness value of the banded region may be calculated according to a rule that decreases in the direction away from the light source, so as to make the brightness value of the banded region closer to its real value. Other specific ways of calibrating the brightness value of the banded region with the brightness values of the remaining regions are also possible, which are not limited herein.
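The first option of step 010 (substituting the average brightness of the remaining regions) can be sketched as follows; the masks are assumed to come from the band detection above.

```python
# Step 010, first option: replace the band's luma with the average luma of the
# remaining regions of the preset range (masks assumed from the detection above).
import numpy as np

def calibrate_band_brightness(y, band_mask, window_mask):
    rest = window_mask & ~band_mask           # remaining regions within the preset range
    out = y.astype(np.float32)
    out[band_mask] = y[rest].mean()           # substitute the average brightness
    return np.clip(out, 0, 255).astype(np.uint8)
```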
In step 011, the color of the banded region is calibrated according to the color of the remaining regions. Specifically, the colors of the pixels of the remaining regions adjacent to the banded region may be substituted for the colors of the pixels in the banded region; alternatively, the most frequently occurring pixel color in the remaining regions may be taken and substituted for the colors of the pixels in the banded region, so that the color of the banded region is closer to the real color. For example, the light source may be the sun, the banded region may originally be the color of a blue sky, and the remaining regions may also be the color of a blue sky, so the real color of the banded region can be well restored from the color of the remaining regions. Other specific ways of calibrating the color of the banded region with the color of the remaining regions are also possible, which are not limited herein.
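The mode-color option of step 011 can be sketched similarly; taking the most frequent exact RGB triple is an assumption, and a practical implementation might bin the colors first.

```python
# Step 011, mode-color option: paint the band with the most frequent color of
# the remaining regions. Using exact RGB triples (no binning) is an assumption.
import numpy as np

def calibrate_band_color(rgb, band_mask, window_mask):
    rest = window_mask & ~band_mask                       # remaining regions within the preset range
    colors, counts = np.unique(rgb[rest], axis=0, return_counts=True)
    out = rgb.copy()
    out[band_mask] = colors[np.argmax(counts)]            # most frequently occurring color
    return out
```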
In the example shown in fig. 8 and 10, the preview image P4 is processed by the image processing method to obtain the preview image P4 ', and in the preview image P4', the brightness and color of the strip-shaped region D4 (the region surrounded by the dotted line in fig. 10 and excluding the light source L) are closer to the brightness and color of the surroundings, so that the influence of dirt on the image quality is eliminated, and the reality of the scene to be displayed is improved.
Referring to fig. 11, the present application also provides a non-transitory computer-readable storage medium 200, where the computer-readable storage medium 200 contains computer-executable instructions 201, and when the computer-executable instructions 201 are executed by one or more processors 300, the processors 300 are caused to execute the image processing method according to any embodiment of the present application.
For example, when the computer-executable instructions 201 are executed by the processor 300, the processor 300 may be configured to perform the steps of:
01: acquiring preview images of a plurality of different scenes;
02: detecting low-brightness areas which are lower in brightness value than other areas and are distributed continuously in each preview image; and
03: when the coincidence degree between the low-brightness regions of the plurality of preview images is higher than a preset threshold value, a notification that the lens 10 is dirty is generated.
As another example, when the computer-executable instructions 201 are executed by the processor 300, the processor 300 may be configured to perform the steps of:
08: detecting a brightness value within a preset range of the periphery of a light source when the light source exists in the preview image; and
09: when a banded region connected with the light source exists in the preset range and the brightness value of the banded region exceeds the brightness value of the remaining regions in the preset range by more than a preset brightness threshold, a prompt message that the lens 10 is dirty is generated.
In the computer-readable storage medium 200 according to the embodiments of the present application, by detecting a low-brightness region in a plurality of preview images, if the coincidence degree of the low-brightness regions in the plurality of preview images is higher than a preset threshold, it is determined that the low-brightness regions are formed because dirt blocks the light, and at this time, a prompt message indicating that the lens 10 is dirty is generated to notify the user.
In the description herein, reference to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (11)

1. An image processing method, comprising:
acquiring preview images of a plurality of different scenes;
detecting low-brightness areas which are lower in brightness value than other areas and are distributed continuously in each preview image; and
and when the coincidence degree of the low-brightness areas of the preview images is higher than a preset threshold value, generating prompt information that the lens is dirty.
2. The image processing method according to claim 1, characterized in that the image processing method further comprises:
dividing the preview image into a plurality of brightness areas according to a plurality of preset brightness value intervals, wherein each brightness value interval is associated with a corresponding adjustment proportion; and
and adjusting the brightness value of the corresponding brightness area according to the adjustment proportion.
3. The image processing method according to claim 1 or 2, characterized in that the image processing method further comprises:
improving the contrast of the preview image; and/or
And improving the color saturation of the preview image.
4. The image processing method according to claim 1, characterized in that the image processing method further comprises:
detecting a brightness value within a preset range of the periphery of a light source when the light source exists in the preview image; and
when a banded region connected with the light source exists in the preset range and the brightness value of the banded region exceeds the brightness value of the remaining regions in the preset range by more than a preset brightness threshold, generating a prompt message that the lens is dirty.
5. The image processing method according to claim 4, characterized in that the image processing method further comprises:
calibrating the brightness value of the banded region according to the brightness values of the remaining regions; and/or
calibrating the color of the banded region according to the color of the remaining regions.
6. A terminal, comprising a lens and a processor, wherein the processor is configured to:
acquiring preview images of a plurality of different scenes;
detecting low-brightness areas which are lower in brightness value than other areas and are distributed continuously in each preview image; and
and when the coincidence degree of the low-brightness areas of the preview images is higher than a preset threshold value, generating prompt information that the lens is dirty.
7. The terminal of claim 6, wherein the processor is further configured to:
dividing the preview image into a plurality of brightness areas according to a plurality of preset brightness value intervals, wherein each brightness value interval is associated with a corresponding adjustment proportion; and
and adjusting the brightness value of the corresponding brightness area according to the adjustment proportion.
8. The terminal of claim 7, wherein the processor is further configured to:
improving the contrast of the preview image; and/or
And improving the color saturation of the preview image.
9. The terminal of claim 6, wherein the processor is further configured to:
detecting a brightness value within a preset range of the periphery of a light source when the light source exists in the preview image; and
when a banded region connected with the light source exists in the preset range and the brightness value of the banded region exceeds the brightness value of the remaining regions in the preset range by more than a preset brightness threshold, generating a prompt message that the lens is dirty.
10. The terminal of claim 9, wherein the processor is further configured to:
calibrating the brightness value of the banded region according to the brightness values of the remaining regions; and/or
calibrating the color of the banded region according to the color of the remaining regions.
11. A non-transitory computer-readable storage medium containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the image processing method of any one of claims 1 to 5.
CN202010157432.9A 2020-03-09 2020-03-09 Image processing method, terminal and computer readable storage medium Active CN111405177B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010157432.9A CN111405177B (en) 2020-03-09 2020-03-09 Image processing method, terminal and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN111405177A true CN111405177A (en) 2020-07-10
CN111405177B CN111405177B (en) 2021-09-24

Family

ID=71413936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010157432.9A Active CN111405177B (en) 2020-03-09 2020-03-09 Image processing method, terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111405177B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114866699A (en) * 2022-05-23 2022-08-05 Oppo广东移动通信有限公司 Image processing method and device, computer readable storage medium and electronic device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130033585A1 (en) * 2011-08-04 2013-02-07 Aptina Imaging Corporation Systems and methods for color compensation in multi-view video
CN104509090A (en) * 2012-07-27 2015-04-08 歌乐牌株式会社 Vehicle-mounted image recognition device
CN107404647A (en) * 2016-05-20 2017-11-28 中兴通讯股份有限公司 Camera lens condition detection method and device
CN107613192A (en) * 2017-08-09 2018-01-19 深圳市巨龙创视科技有限公司 A kind of Digital Image Processing algorithm based on video camera module
CN107945158A (en) * 2017-11-15 2018-04-20 上海摩软通讯技术有限公司 A kind of dirty method and device of detector lens
CN109472738A (en) * 2018-10-26 2019-03-15 深圳市商汤科技有限公司 Image irradiation correcting method and device, electronic equipment and storage medium
CN110738629A (en) * 2018-07-02 2020-01-31 中兴通讯股份有限公司 lens contamination detection method, terminal and computer readable storage medium
CN110766679A (en) * 2019-10-25 2020-02-07 普联技术有限公司 Lens contamination detection method and device and terminal equipment


Also Published As

Publication number Publication date
CN111405177B (en) 2021-09-24

Similar Documents

Publication Publication Date Title
CA2996751C (en) Calibration of defective image sensor elements
US9646397B2 (en) Image processing apparatus and image processing method
JP4403397B2 (en) User interface providing device
EP3343911A1 (en) Image signal processing method and system
US8229217B2 (en) Image processing method and apparatus, image processing program and medium storing this program
JP7369175B2 (en) Image processing device and its control method and program
CN107635103B (en) Image processing method, mobile terminal and medium product
US7817190B2 (en) Method and apparatus for processing an image exposed to backlight
US10992854B2 (en) Image processing apparatus, imaging apparatus, image processing method, and storage medium
US20120127336A1 (en) Imaging apparatus, imaging method and computer program
CN106534677B (en) Image overexposure optimization method and device
US10070019B2 (en) Method of reducing color moire and image processing apparatus using the method
JP7136956B2 (en) Image processing method and device, terminal and storage medium
US20110286680A1 (en) Medium storing image processing program, image processing device, and image processing method
CN108307098A (en) Fisheye camera shadow correction parameter determination method, bearing calibration and device, storage medium, fisheye camera
JP2011041056A (en) Imaging apparatus and imaging method
CN111405177B (en) Image processing method, terminal and computer readable storage medium
WO2011000392A1 (en) Method and camera system for improving the contrast of a camera image
JP2013162339A (en) Imaging apparatus
US20100245590A1 (en) Camera sensor system self-calibration
TW202218403A (en) Correction of color tinted pixels captured in low-light conditions
US7397968B2 (en) System and method for tone composition
JP2008305122A (en) Image-processing apparatus, image processing method and program
US11805326B2 (en) Image processing apparatus, control method thereof, and storage medium
KR101840754B1 (en) Brightness section unit image quality compensating system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant