CN113076997A - Lens fogging identification method, camera module and terminal device - Google Patents
- Publication number
- CN113076997A (application CN202110353451.3A)
- Authority
- CN
- China
- Prior art keywords
- defogging
- detection
- image data
- gray
- fog
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B08—CLEANING
- B08B—CLEANING IN GENERAL; PREVENTION OF FOULING IN GENERAL
- B08B5/00—Cleaning by methods involving the use of air flow or gas flow
- B08B5/02—Cleaning by the force of jets, e.g. blowing-out cavities
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/55—Details of cameras or camera bodies; Accessories therefor with provision for heating or cooling, e.g. in aircraft
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/30—Noise filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/52—Elements optimising image sensor operation, e.g. for electromagnetic interference [EMI] protection or temperature control by heat transfer or cooling elements
Abstract
The application discloses a lens fogging identification method, a camera module and a terminal device. The total number of pixels and the number of characteristic pixels within a characteristic gray-scale value range are counted, a defogging coefficient is obtained from these two counts, the coefficient is compared with a preset defogging threshold, and defogging is performed according to the comparison result. The processing is simple, requires no large-scale data analysis, and quickly and efficiently establishes the current fog condition so that the defogging operation can be carried out in time. Moreover, because the per-gray-value pixel counts are obtained after gray-scale conversion and filtering, the resulting data are subject to little interference, are stable, provide a good reference for comparison, and support an accurate defogging operation. A terminal device or camera module adopting the lens fogging identification method of this application can self-check and carry out the defogging task efficiently, giving the user a clear view without requiring the user to take part in the defogging task.
Description
Technical Field
The application relates to the technical field of image processing, and in particular to a lens fogging identification method, a camera module and a terminal device.
Background
Moisture suspended in the atmosphere readily condenses on the lens surface of a camera as the ambient temperature changes, so that the picture captured by the imaging device appears whitish. For example, a vehicle is frequently outdoors; when it travels in rain or fog, or when the temperature difference between day and night is large, frost and fog form on the vehicle-mounted camera module, which impairs the driver's judgment of the surroundings and creates a safety hazard.
Disclosure of Invention
In order to solve the above problems, the present application provides a lens fogging identification method, a camera module and a terminal device.
In a first aspect, a lens fogging identification method provided in an embodiment of the present application includes:
Acquiring target image data, and performing gray-scale conversion on the target image data to obtain target gray-scale image data.
Filtering the target gray-scale image data to obtain target filtered image data.
Acquiring the total number of pixels of the target filtered image data, and acquiring the number of characteristic pixels within the characteristic gray-scale value range of the target filtered image data.
Obtaining a defogging coefficient of the target image data from the number of characteristic pixels and the total number of pixels, comparing the defogging coefficient with a preset defogging threshold to obtain a first comparison result, and deciding whether to execute defogging processing according to the first comparison result.
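As an illustrative sketch of the four steps above (assuming NumPy, a Sobel-style gradient magnitude standing in for the Canny/Sobel edge filtering named in the detailed description, and placeholder values for the characteristic range and threshold, which the application derives from detection images):

```python
import numpy as np

def to_gray(image_rgb):
    # Step 1: gray-scale conversion using ITU-R BT.601 luma weights
    return image_rgb @ np.array([0.299, 0.587, 0.114])

def sobel_magnitude(gray):
    # Step 2: edge filtering via 3x3 Sobel gradient magnitude, clipped to 0-255
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            win = pad[i:i + h, j:j + w]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.clip(np.hypot(gx, gy), 0, 255).astype(np.uint8)

def defog_coefficient(filtered, feature_range=(70, 100)):
    # Steps 3-4: count total pixels and characteristic pixels, take the ratio
    lo, hi = feature_range
    feature = np.count_nonzero((filtered >= lo) & (filtered <= hi))
    return feature / filtered.size

def needs_defogging(image_rgb, threshold=0.6):
    # threshold is a hypothetical stand-in for the preset defogging threshold
    return defog_coefficient(sobel_magnitude(to_gray(image_rgb))) > threshold
```

The 70-100 characteristic range reappears in the worked example at the end of the description; the 0.6 threshold here is purely hypothetical.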
According to the lens fogging identification method above, the total number of pixels and the number of characteristic pixels within the characteristic gray-scale value range are counted, the defogging coefficient is obtained from these counts and compared with the preset defogging threshold, and defogging processing is then performed according to the comparison result. Moreover, because the per-gray-value pixel counts are obtained after gray-scale conversion and filtering, the resulting data are subject to little interference and are stable, providing a good reference for comparison and supporting an accurate defogging operation.
In some embodiments, the characteristic gray-scale value range is obtained as follows: a plurality of detection images captured at various fog amounts are converted into gray-scale images in advance and filtered to obtain corresponding detection filtered image data; the number of pixels within multiple groups of gray-scale value ranges and the total number of pixels are obtained for each set of detection filtered image data, yielding multiple groups of detection coefficients; and the detection coefficients of the different sets of detection filtered image data within the same gray-scale value range are compared to obtain a second comparison result, from which the characteristic gray-scale value range is obtained.
Based on the above embodiment, the characteristic gray-scale value range is obtained by comparing how the pixel counts vary across the multiple gray-scale value ranges, so that the variation pattern within each range can be identified and the chosen characteristic range is reliable when applied to the actual defogging process.
In some embodiments, the defogging coefficient is obtained from the ratio of the number of characteristic pixels to the total number of pixels, and each detection coefficient is obtained from the ratio of the number of pixels within a gray-scale value range of the detection filtered image data to the total number of pixels.
Based on the above embodiment, the defogging and detection coefficients are simple to compute, so the many detection coefficients of the multiple sets of detection filtered image data can be obtained quickly and efficiently, and the defogging coefficient can likewise be obtained efficiently when the defogging operation is performed.
In some embodiments, the plurality of fog states includes a fog amount of 0% and at least one fog amount greater than 0%.
Based on this embodiment, detection images captured at a fog amount of 0% simulate the fog-free shooting state, while images captured at fog amounts greater than 0% simulate shooting under various fog conditions, so that data obtained in fogged and fog-free states can be compared and the sample set is enriched.
In some embodiments, the obtaining of the characteristic gray scale value range further comprises:
Comparing the plurality of groups of detection coefficients, and selecting, as the characteristic gray-scale value range, the gray-scale value range corresponding to the detection coefficient obtained when the fog amount is 0% and the detection coefficients obtained when the fog amount is greater than 0%.
Based on the above embodiment and on the pixel-count distribution corresponding to each gray-scale value, the gray-scale value ranges in which the pixel count fluctuates little can be conveniently and effectively screened out in this way to obtain the characteristic gray-scale value range.
In some embodiments, the defogging threshold is the detection coefficient, within the characteristic gray-scale value range, of a detection image obtained at a fog amount of 0%.
Based on the above embodiment, it can be understood that a fog amount of 0% is the state in which the lens of the camera assembly is normally used and remains for most of the time; the detection coefficient obtained at a fog amount of 0% can therefore serve as a blank baseline parameter, so that abnormal conditions such as fogging can be identified quickly.
In some embodiments, the plurality of detection images are obtained under different ambient brightness states.
Based on the above embodiment, the situations that may occur during image capture are simulated more fully, the sample set is enriched, and the reliability of the obtained defogging threshold and characteristic gray-scale value range is improved.
In some embodiments, the ambient brightness is obtained from the exposure time or gain provided by a sensing component that assists in capturing the image, or from the ISO value, EV value, AE value, or ISP gain value provided by the photosensitive element.
Based on this embodiment, a brightness value synchronized with the capture of the target image data or the detection image can be obtained; the brightness reading neither leads nor lags the capture, ensuring that the brightness values used to enrich the sample set are reliable and valid. This way of acquiring brightness is simple and requires no additional auxiliary equipment.
In some embodiments, the method further comprises capturing the target image data with a camera, and the defogging processing in the step of deciding whether to execute defogging according to the first comparison result comprises heating the camera.
Based on this embodiment, directly heating the camera removes frost from the surface of its lens; the defogging approach is direct and efficient.
In a second aspect, an embodiment of the present application provides a camera module that includes a camera component and a camera processor; the camera component is configured to obtain target image data, and the camera processor is configured to execute the steps of the lens fogging identification method above to perform defogging processing for the camera module.
Based on the camera module of this embodiment, when fog is present the camera module applies the lens fogging identification method above to remove attached frost and fog quickly and efficiently, improving the clarity of the images it captures.
In a third aspect, an embodiment of the present application provides a terminal device that includes a camera, a processor, and a memory; the memory stores a computer program operable on the processor, and the processor executes the computer program according to the steps of the lens fogging identification method described above.
According to the terminal device of this embodiment, the terminal device can self-check and thus execute the defogging task quickly and efficiently using the lens fogging identification method, giving the user a clear view without requiring the user to take part in the defogging task.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the lens fogging identification method described above.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a diagram of the implementation environment of the lens fogging identification method on a terminal device according to an embodiment of the present application;
Fig. 2 is a side view of an automobile provided in accordance with an embodiment of the present application;
Fig. 3 is a diagram of the implementation environment of the method on a camera module according to an embodiment of the present application;
Fig. 4 is a flow chart of a lens fogging identification method according to an embodiment of the present application;
Fig. 5 is a flowchart of a method for obtaining the characteristic gray-scale value range according to an embodiment of the present application;
Fig. 6 is a graph of the pixel counts corresponding to the gray-scale values of a detection image before filtering according to an embodiment of the present application;
Fig. 7 is a graph of the pixel counts corresponding to the gray-scale values of the detection image of fig. 6 after filtering;
Fig. 8 is a histogram of the defogging coefficients corresponding to the different fog amounts in fig. 7 within the characteristic gray-scale value range.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Heating the camera effectively removes frost and fog from the lens surface, so that the user can obtain a clear image in time. In the related art, however, heating requires the user to operate the defogging device manually after noticing that the captured image is blurred by fog, which distracts the user and is inconvenient to operate. If instead the heating device runs continuously to prevent frost and fog, the camera module stays hot for long periods, which shortens its service life.
Therefore, to solve the above problems, embodiments of the present application provide a lens fogging identification method, a camera module and a terminal device. The lens fogging identification method compares and analyzes captured images so that fog on the camera lens can be removed promptly, improving the convenience of using and operating the camera.
As shown in fig. 1, the lens fogging identification method of the present application can be applied to a terminal device 100. The terminal device 100 has an image-capturing function and includes, but is not limited to, a tablet computer, a notebook computer, a wearable device, an automobile, or a flight device.
The terminal device 100 may include a camera 110, a processor 120, and a memory 130.
The camera 110 may be a visible light camera or an infrared light camera, and the like, and is configured to transmit image data obtained by shooting to the memory 130.
The memory 130 includes at least one type of readable storage medium, which may be a non-volatile storage medium such as a flash memory, hard disk, multimedia card, or card-type memory. In some embodiments, the readable storage medium is an internal storage unit of the terminal device 100, such as its hard disk. In other embodiments, the readable storage medium may be an external storage device of the terminal device 100, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the terminal device 100.
In the present embodiment, the readable storage medium of the memory 130 stores a computer program for executing the lens fogging identification method. The memory 130 may also be used to temporarily store data that has been output or is to be output.
The processor 120 may, in some embodiments, be a Central Processing Unit (CPU), a microprocessor, or another data processing chip, used to run program code stored in the memory 130 or to process data, such as a program for performing image defogging.
The terminal device 100 may further include a defogging component 140 installed near the camera 110. The defogging component 140 may include a blower or a heater: the blower receives signals sent by the processor 120 and blows air toward the lens of the camera 110 to perform defogging, while the heater receives signals sent by the processor 120 and heats the lens of the camera 110 to perform defogging.
The terminal device 100 may further comprise a display 150, also referred to as a display screen or display unit. In some embodiments, the display 150 can be an LED display, a liquid crystal display, a touch liquid crystal display, an Organic Light-Emitting Diode (OLED) display, or the like. The display 150 displays the information processed in the terminal device 100 and presents a visual user interface.
As shown in fig. 2, taking the terminal device 100 to be the automobile 10 as an example, the automobile 10 includes an automobile body 11. The processor 120, memory 130 and display 150 are installed in the body 11; the camera 110 is installed at the rear end of the body 11 and is electrically connected to the processor 120, the memory 130 and the display 150; and the defogging component 140 is installed on the body 11 near the camera 110 so as to remove frost from the lens of the camera 110 in time.
As shown in fig. 3, the lens fogging identification method of the present application can also be applied to a camera module 200, which in turn can be used in cameras, video recorders, and other devices with an imaging function, but is not limited to these. The camera module 200 comprises a camera component 210 and a camera processor 220, and further comprises an internal memory 230 installed inside the module. The readable storage medium of the internal memory 230 stores a computer program for executing the lens fogging identification method, and the camera processor 220, a processor installed inside the camera module 200, runs the program code and processes the data stored in the internal memory 230, so that the camera module 200 can perform the defogging operation by itself without relying on a processor in an external device such as a terminal device. The camera module 200 may further include a heating component 240, which receives signals sent by the camera processor 220 and heats the camera component 210 to remove frost and fog from its lens.
As shown in fig. 4, when the processor 120 executes the image defogging program stored in the memory 130, or the camera processor 220 executes the image defogging program stored in the internal memory 230, the following steps S101 to S104 are carried out.
Step S101, obtaining target image data, and carrying out gray level conversion on the target image data to obtain target gray level image data.
Specifically, the target image data may be captured by the camera 110 of the terminal device 100 or by the camera component 210 of the camera module 200. The camera 110 or camera component 210 may be a visible-light unit, in which case the target gray-scale image data is obtained by performing gray-scale conversion on the captured target image data. In other embodiments, the camera 110 or camera component 210 may instead be an infrared unit, which can capture the target gray-scale image data directly.
And S102, filtering the target gray-scale image data to obtain target filtered image data.
The target gray-scale image data is filtered by edge filtering; specifically, edge filtering is performed with the Canny or Sobel algorithm, and noise is filtered out in the process.
Step S103, acquiring the total number of pixels of the target filtered image data, and acquiring the number of characteristic pixels within the range of the characteristic gray-scale value of the target filtered image data.
Obtaining the total number of pixels of the target filtered image data means counting its pixels over the full gray-scale value range of 0 to 255, and the characteristic gray-scale value range is a certain sub-range of 0 to 255.
In some other embodiments, the number of pixels within a certain gray-scale value range M of 0 to 255 may be used as the total number of pixels; a sub-range m within M is then selected as the characteristic gray-scale value range, and the number of pixels within m is taken as the number of characteristic pixels. For example, if after filtering the numbers of pixels in the ranges 0-10 and 240-255 are small or close to 0, the number of pixels in the range 10-240 is used as the total number of pixels, and a gray-scale value range within 10-240 is then selected as the characteristic gray-scale value range for counting the characteristic pixels.
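A sketch of this alternative counting scheme (NumPy; the 10-240 total range and the characteristic sub-range mirror the example above):

```python
import numpy as np

def counts_in_restricted_range(filtered, total_range=(10, 240),
                               feature_range=(70, 100)):
    # Histogram of the filtered image over all 256 gray-scale values
    hist = np.bincount(filtered.ravel(), minlength=256)
    lo_m, hi_m = total_range      # range M: basis for the total pixel count
    lo_f, hi_f = feature_range    # sub-range m: characteristic pixel count
    total_pixels = int(hist[lo_m:hi_m + 1].sum())
    feature_pixels = int(hist[lo_f:hi_f + 1].sum())
    return feature_pixels, total_pixels
```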
And step S104, acquiring a defogging coefficient of the target image data according to the number of the characteristic pixels and the total amount of the pixels, comparing the defogging coefficient with a preset defogging threshold value to acquire a first comparison result, and judging whether to execute defogging processing according to the first comparison result.
The preset defogging threshold may be stored in advance in the memory 130 or the internal memory 230; when performing the numerical comparison, the processor 120 or the camera processor 220 retrieves it and compares it with the currently obtained defogging coefficient to obtain the first comparison result. Based on the first comparison result, the processor 120 executes the program code stored in the memory 130 or the internal memory 230 to carry out the defogging processing.
The defogging coefficient can be obtained from the ratio of the number of characteristic pixels to the total number of pixels, which quickly captures the numerical relation between the two quantities. In other embodiments, the defogging coefficient may instead be the sum or the difference of the number of characteristic pixels and the total number of pixels.
According to the lens fogging identification method above, the total number of pixels and the number of characteristic pixels within the characteristic gray-scale value range are counted, the defogging coefficient is obtained from these counts and compared with the preset defogging threshold, and defogging is then performed according to the comparison result. Because the per-gray-value pixel counts are obtained after gray-scale conversion and filtering, the resulting data are subject to little interference, are stable, and provide a good reference for comparison.
As shown in fig. 5, specifically, the characteristic gray scale value range is obtained by:
step S201, converting a plurality of detection images obtained in various fog states into gray-scale images in advance, and filtering the gray-scale images to obtain corresponding detection filtered image data.
The filtering applied to the detection images after their conversion into gray-scale images is the same as the filtering applied to the target gray-scale image data. As shown in figs. 6 and 7, after filtering, the pixel counts corresponding to each gray-scale value in fig. 6 become those shown in fig. 7.
Step S202, obtaining the number of pixels and the total number of pixels of each detection filter image data in the range of the multiple sets of gray-scale values to obtain multiple sets of detection coefficients.
For each set of detection filtered image data, at least one gray-scale value range is selected. More groups, such as two, three, four, five or six, can also be selected so that the pixel counts in different ranges can be compared and the pixel-count distribution in each range explored comprehensively. For example, four consecutive ranges 50-100, 100-150, 150-200 and 200-250 can be selected from the gray-scale values 0-255, or four overlapping ranges 50-150, 70-170, 60-200 and 80-220; the groups of ranges can be set flexibly according to the specific situation.
Step S203, comparing the multiple detection coefficients corresponding to the multiple detection filtered image data in the same gray level value range to obtain a second comparison result, and obtaining a characteristic gray level value range according to the second comparison result.
Specifically, let the number of selected sets of detection filtered image data be n, where n is an integer greater than or equal to 1. For each set, the numbers of pixels within the gray-scale ranges m1, m2, m3, m4, and so on are obtained; the counts within the same range (m1, m2, m3 or m4) are then compared across sets one by one, and the variation of the pixel count within each range is recorded as the second comparison result. A gray-scale value range in which the pixel count fluctuates little can then be selected as the characteristic gray-scale value range. For example, if the pixel counts of the n sets of detection filtered image data fluctuate within a small range over the gray-scale range m2, then m2 is taken as the characteristic gray-scale value range.
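Step S203 can be sketched as follows (NumPy; "small fluctuation" is read here as the smallest standard deviation of the pixel count across the n detection-filtered images, which is one plausible implementation of the comparison):

```python
import numpy as np

def pick_characteristic_range(filtered_images, candidate_ranges):
    # For each candidate range m1, m2, ..., count pixels per image and
    # measure the fluctuation of that count across the n images
    fluctuation = []
    for lo, hi in candidate_ranges:
        per_image = [int(np.count_nonzero((img >= lo) & (img <= hi)))
                     for img in filtered_images]
        fluctuation.append(np.std(per_image))
    # The range whose count fluctuates least becomes the characteristic range
    return candidate_ranges[int(np.argmin(fluctuation))]
```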
When the defogging processing is performed, the defogging coefficient can be obtained as the ratio of the number of characteristic pixels to the total number of pixels; correspondingly, each detection coefficient is obtained as the ratio of the number of pixels within a given gray-scale value range of the detection filtered image data to the total number of pixels. Alternatively, when the defogging coefficient is obtained as the sum or difference of the number of characteristic pixels and the total number of pixels, the detection coefficients are correspondingly obtained as sums or differences, so that the many detection coefficients of the multiple sets of detection filtered image data can be obtained quickly and efficiently. In some other embodiments, the second comparison result may instead be obtained by tabulating the detection coefficients of each set of detection filtered image data within each gray-scale range, and the range in which the detection coefficients fluctuate least is selected as the characteristic gray-scale value range.
The detection images for the various fog states can be acquired indoors: detection environments with different fog amounts are created by controlling the air humidity and temperature of the indoor environment, and the same object is then photographed in each of these environments to obtain detection images for each fog state.
To simulate more fully the weather conditions the product will encounter in use, the selected fog states include a fog amount of 0% and at least one fog amount greater than 0%. Detection images captured at a fog amount of 0% simulate the fog-free shooting state, and images captured at fog amounts greater than 0% simulate shooting under various fog conditions, so that data obtained in fogged and fog-free states can be compared and the sample set is enriched.
It can be understood that the numbers of pixels corresponding to different gray-scale values differ. For example, fig. 7 shows the pixel-count distributions of the filtered gray-scale detection images at fog amounts of 0%, 50% and 100%. As fig. 7 shows, for detection image data captured of the same object, the pixel count corresponding to 0% fog may be greater than, equal to, or less than the count corresponding to fog amounts above 0%, and correspondingly, within different gray-scale value ranges, the counted pixels at 0% fog may likewise be greater than, equal to, or less than those at fog amounts above 0%. In some embodiments, obtaining the characteristic gray-scale value range further comprises: comparing the plurality of groups of detection coefficients, and selecting, as the characteristic gray-scale value range, the gray-scale value range corresponding to the detection coefficient obtained when the fog amount is 0% and the detection coefficients obtained when the fog amount is greater than 0%. Based on the pixel-count distribution corresponding to each gray-scale value, the ranges in which the pixel count fluctuates little can be conveniently and effectively screened out in this way.
It can be understood that a fog amount of 0% is the state the camera 110 or the camera module 210 is in for most of the time during normal use, so the detection coefficient obtained in the 0%-fog state can serve as a blank (baseline) parameter. In some embodiments, the defogging threshold may be obtained from the detection coefficients, over the characteristic gray-scale value range, of detection images captured at a fog amount of 0%. Specifically, when obtaining the defogging threshold, the detection coefficients of the 0%-fog detection images over the characteristic gray-scale value range can be compared; since these coefficients differ across detection conditions, their mean and standard deviation can be computed. The mean may be used directly as the defogging threshold, one standard deviation may be added to the mean, or two standard deviations may be subtracted from it; the specific way of selecting the defogging threshold can be determined from the numerical results of actual experiments.
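The mean-plus-deviation statistic described here can be sketched with the standard library; the coefficient values in the usage note are illustrative, not experimental data.

```python
import statistics

def defogging_threshold(fog_free_coeffs, k=1.0):
    """Defogging threshold as the mean of the fog-free (0% fog) detection
    coefficients shifted by k standard deviations; k (which may be negative)
    is chosen from the numerical results of actual experiments."""
    mean = statistics.mean(fog_free_coeffs)
    std = statistics.stdev(fog_free_coeffs)
    return mean + k * std
```

For example, coefficients of 50.0, 52.0, and 54.0 give a mean of 52.0 and a standard deviation of 2.0, so `k=1.0` yields a threshold of 54.0, while `k=0` uses the mean directly.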
The method of obtaining the defogging threshold is described below using the pixel counts of one set of detection images, taking the gray-scale range 70–100 as the characteristic gray-scale value range.
S301: from the statistics of fig. 7, within the gray-scale value range 70–100, the number of pixels corresponding to a fog amount of 0% is 66943, the number corresponding to 50% is 81071, and the number corresponding to 100% is 95999.
S302: within the gray-scale value range 0–255, the total number of pixels corresponding to 0% fog is 126159, the total corresponding to 50% fog is 127414, and the total corresponding to 100% fog is 126159.
S303: from the above data, the detection coefficient corresponding to 0% fog is calculated as 53.06%, the coefficient corresponding to 50% fog as 63.63%, and the coefficient corresponding to 100% fog as 75.07%; these are tallied in fig. 8.
S304: a standard deviation is obtained from the multiple groups of statistical detection coefficients corresponding to the characteristic gray-scale range 70–100. Adding the standard deviation to the 53.06% detection coefficient corresponding to a fog amount of 0% gives a corrected coefficient of 55.00%, and 55.00% is then applied as the defogging threshold in the actual defogging process.
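The arithmetic of steps S301–S303 can be checked directly from the pixel counts above; the sketch below reproduces the 0% and 50% fog coefficients as a plain ratio rounded to two decimals (variable names are illustrative).

```python
# Pixel counts read from fig. 7, per fog amount:
# (pixels in the characteristic gray range 70-100, total pixels over 0-255)
counts = {0: (66943, 126159), 50: (81071, 127414)}

# S303: detection coefficient = in-range pixel count / total pixel count,
# expressed as a percentage rounded to two decimal places.
coeffs = {fog: round(100 * n / total, 2) for fog, (n, total) in counts.items()}
# coeffs == {0: 53.06, 50: 63.63}
```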
To further enrich the sample capacity and improve the comprehensiveness of the detection method, in some embodiments the plurality of detection images are obtained under different ambient-brightness states, so as to fully simulate the environmental changes the camera 110 or the camera module 210 undergoes in use and to improve the reliability of the defogging threshold and characteristic gray-scale value range obtained. Specifically, the different ambient brightnesses may include the lighting conditions of several periods of the day, such as early morning, noon, afternoon, evening, and night, and under each lighting condition multiple groups of detection images may be acquired and added to the detection samples.
In some embodiments, the ambient brightness is obtained from the exposure time or gain provided by a sensing component that assists in capturing the image; alternatively, it is obtained from an ISO (sensitivity) value, EV (exposure value), AE (automatic exposure) value, or ISP (image signal processor) gain value provided by the photosensitive element. In this way, a brightness value synchronized with the captured target image data or detection image is obtained, so the brightness condition is acquired neither ahead of nor behind the image, ensuring the reliability and validity of the samples enriched by brightness value. This way of acquiring brightness is simple and requires no additional auxiliary equipment.
In some embodiments, the defogging processing method further includes a step of capturing target image data with the camera 110, and judging whether to execute defogging according to the first comparison result includes heating the camera 110. Specifically, the processor 120 or the camera processor 220 may be configured to periodically compute the defogging coefficient of the current state and compare it with the defogging threshold. When the processor 120 or the camera processor 220 detects that the current defogging coefficient is greater than the defogging threshold, it controls the heating device to heat the camera 110 so that the fog on its surface evaporates; when it detects that the current defogging coefficient is less than or equal to the defogging threshold, the processor 120 controls the heating device to stop heating or to remain in its original unheated state.
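The periodic check described here can be sketched as a simple control loop. The frame-capture and heater interfaces below are hypothetical placeholders (the patent does not specify these APIs), and the 70–100 range and 0.55 threshold are taken from the worked example above.

```python
import time

def defogging_coefficient(gray_histogram, lo=70, hi=100):
    """Defogging coefficient of one frame: characteristic pixels / total pixels."""
    return sum(gray_histogram[lo:hi + 1]) / sum(gray_histogram)

def monitor(capture_histogram, heater, threshold=0.55, period_s=5.0, cycles=None):
    """Periodically compare the current defogging coefficient with the
    defogging threshold (the first comparison result) and switch the heater."""
    n = 0
    while cycles is None or n < cycles:
        coeff = defogging_coefficient(capture_histogram())
        if coeff > threshold:
            heater.on()   # fog detected: heat the lens so the fog evaporates
        else:
            heater.off()  # clear: stop heating or remain unheated
        time.sleep(period_s)
        n += 1
```

In practice `capture_histogram` would grab a frame, convert it to gray scale, filter it, and return its histogram, and `heater` would drive the actual heating device.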
In other embodiments, a program for defogging the image is also stored in the memory 130 or the internal memory 230, and the processor 120 or the camera processor 220 executes this program to process the image alone, improving the sharpness of the captured image. The processor 120 or the camera processor 220 can also execute the image-defogging program in addition to controlling the heating device to heat the lens, further improving the sharpness of the captured image.
The same or similar reference numerals in the drawings of the present embodiment correspond to the same or similar components. In the description of the present application, it should be understood that orientation or positional terms such as "upper", "lower", "left", and "right" are based on the orientations or positional relationships shown in the drawings, are used only for convenience and simplicity of description, and do not indicate or imply that the referenced device or element must have a specific orientation or be constructed and operated in a specific orientation; such terms are therefore illustrative only and are not to be construed as limiting this patent. The specific meanings of the above terms can be understood by those skilled in the art according to the specific situation.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.
Claims (12)
1. A lens fog identification method is characterized by comprising the following steps:
acquiring target image data, and performing gray level conversion on the target image data to obtain target gray level image data;
filtering the target gray level image data to obtain target filtering image data;
acquiring the total pixel amount of the target filtering image data and acquiring the number of characteristic pixels in the characteristic gray-scale value range of the target filtering image data;
and obtaining a defogging coefficient of the target image data according to the number of the characteristic pixels and the total amount of the pixels, comparing the defogging coefficient with a preset defogging threshold value to obtain a first comparison result, and judging whether to execute defogging processing according to the first comparison result.
2. The lens fogging identification method according to claim 1, wherein the characteristic gray-scale value range is obtained by: converting a plurality of detection images obtained under a plurality of fog amount states into gray-scale images in advance and filtering them to obtain corresponding detection filtering image data; obtaining the pixel number within multiple groups of gray-scale value ranges and the total pixel number of each piece of detection filtering image data to obtain multiple groups of detection coefficients; and comparing the detection coefficients of the respective detection filtering image data within the same gray-scale value range to obtain a second comparison result, and obtaining the characteristic gray-scale value range according to the second comparison result.
3. The lens fogging identification method according to claim 2,
the defogging coefficient is obtained according to the ratio of the number of the characteristic pixels to the total number of the pixels;
the detection coefficients are obtained from the ratio of the number of pixels in each gray-scale value range of the detection-filtered image data to the total number of pixels.
4. The lens fogging identification method according to claim 2, wherein the plurality of fogging states include a fogging state in which a fogging amount is 0% and at least one group of fogging states in which a fogging amount is greater than 0%.
5. The method of claim 4, wherein the range of characteristic grayscale values is obtained by a method further comprising:
and comparing the plurality of groups of detection coefficients, and selecting a gray level value range corresponding to the detection coefficient obtained in the state that the fog amount is greater than 0% and the detection coefficient obtained in the state that the fog amount is greater than 0% as the characteristic gray level value range.
6. The lens fogging identification method according to claim 4, wherein the defogging threshold value is the detection coefficient, over the characteristic gray-scale value range, of the detection image obtained in the state where the fog amount is 0%.
7. The lens fogging recognition method according to claim 2, wherein the plurality of detection images are a plurality of detection images obtained under different ambient brightness states.
8. The lens fogging identification method according to claim 7, wherein the ambient brightness is obtained by an exposure time or gain provided by a sensing device for assisting in capturing an image; or the ambient brightness is obtained by an ISO value, an EV value, an AE value or an ISP gain value provided by the photosensitive element.
9. The lens fogging identification method according to claim 1, wherein the defogging processing method further includes a step of capturing the target image data by using a camera, and the defogging processing method in the step of determining whether to perform defogging processing according to the first comparison result includes heating the camera.
10. A camera module, comprising a camera assembly for acquiring target image data and a camera processor for executing the steps of the method according to any one of claims 1 to 8 to defog the camera assembly.
11. A terminal device, comprising a camera, a processor, and a memory in which a computer program executable on the processor is stored, wherein the processor, when executing the computer program, carries out the steps of the method according to any one of claims 1 to 8.
12. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110353451.3A CN113076997B (en) | 2021-03-31 | 2021-03-31 | Lens band fog identification method, camera module and terminal equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113076997A true CN113076997A (en) | 2021-07-06 |
CN113076997B CN113076997B (en) | 2023-01-03 |
Family
ID=76614345
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110353451.3A Active CN113076997B (en) | 2021-03-31 | 2021-03-31 | Lens band fog identification method, camera module and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113076997B (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140177960A1 (en) * | 2012-12-24 | 2014-06-26 | Korea University Research And Business Foundation | Apparatus and method of processing image |
US20140247968A1 (en) * | 2011-11-16 | 2014-09-04 | Bayerische Motoren Werke Aktiengesellschaft | Method for Fog Detection |
CN104077750A (en) * | 2014-06-18 | 2014-10-01 | 深圳市金立通信设备有限公司 | Image processing method |
CN104217215A (en) * | 2014-08-28 | 2014-12-17 | 哈尔滨工程大学 | Classification and identification method for foggy water surface image and clear water surface image |
CN104713526A (en) * | 2015-04-01 | 2015-06-17 | 无锡桑尼安科技有限公司 | Method for detecting types of foreign matters on power transmission line |
CN104899833A (en) * | 2014-03-07 | 2015-09-09 | 安凯(广州)微电子技术有限公司 | Image defogging method and device |
CN109389075A (en) * | 2018-09-29 | 2019-02-26 | 佛山市云米电器科技有限公司 | Intelligent smoke machine lens blur self checking method |
CN110458029A (en) * | 2019-07-15 | 2019-11-15 | 佛山科学技术学院 | Vehicle checking method and device in a kind of foggy environment |
CN110532876A (en) * | 2019-07-26 | 2019-12-03 | 纵目科技(上海)股份有限公司 | Night mode camera lens pays detection method, system, terminal and the storage medium of object |
CN110807406A (en) * | 2019-10-29 | 2020-02-18 | 浙江大华技术股份有限公司 | Foggy day detection method and device |
CN110992327A (en) * | 2019-11-27 | 2020-04-10 | 北京达佳互联信息技术有限公司 | Lens contamination state detection method and device, terminal and storage medium |
CN111738064A (en) * | 2020-05-11 | 2020-10-02 | 南京邮电大学 | Haze concentration identification method for haze image |
Non-Patent Citations (2)
Title |
---|
DU JINGJING: "Research on Defogging Methods for Traffic Images and Their Application", China Master's Theses Full-text Database (Engineering Science and Technology II) *
TANG JIALI et al.: "Simulation of Sharpness Recognition in Super-Resolution Video Images of Vehicles in Fog", Computer Simulation *
Also Published As
Publication number | Publication date |
---|---|
CN113076997B (en) | 2023-01-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10070053B2 (en) | Method and camera for determining an image adjustment parameter | |
US11574481B2 (en) | Camera blockage detection for autonomous driving systems | |
KR102566998B1 (en) | Apparatus and method for determining image sharpness | |
US20110102628A1 (en) | Foreground/Background Segmentation in Digital Images | |
CN112767392B (en) | Image definition determining method, device, equipment and storage medium | |
US9909859B2 (en) | Apparatus and method for measuring visual range using geometrical information of an image and an image pattern recognition technique | |
CN101142812A (en) | Image processing device and method, program and recording medium | |
WO2014055984A1 (en) | Imaging through aerosol obscurants | |
WO2016150593A1 (en) | Method for generating a digital record and roadside unit of a road toll system implementing the method | |
CN110572636B (en) | Camera contamination detection method and device, storage medium and electronic equipment | |
EP2609567A1 (en) | Sensor data processing | |
CN109916415B (en) | Road type determination method, device, equipment and storage medium | |
CN101764922B (en) | Method and device for adaptive generation of luminance threshold | |
CN111369317B (en) | Order generation method, order generation device, electronic equipment and storage medium | |
KR20210040258A (en) | A method and apparatus for generating an object classification for an object | |
CN111654643B (en) | Exposure parameter determination method and device, unmanned aerial vehicle and computer readable storage medium | |
CN113076997B (en) | Lens band fog identification method, camera module and terminal equipment | |
US20130070966A1 (en) | Method and device for checking the visibility of a camera for surroundings of an automobile | |
Hertel et al. | Image quality standards in automotive vision applications | |
CN112861676B (en) | Smoke and fire identification marking method, system, terminal and storage medium | |
CN117315350B (en) | Hot spot detection method and device for photovoltaic solar panel based on unmanned aerial vehicle | |
US8818093B2 (en) | Method and device for analyzing an image of an image recording device for a vehicle | |
JP2016103787A (en) | Image processing device, image processing system, image processing method, and image processing program | |
US20230186639A1 (en) | Camera blockage detection for autonomous driving systems | |
JP2012023572A (en) | White balance coefficient calculating device and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||