CN116957984A - Method and system for monitoring unmanned aerial vehicle based on low-illuminance haze
- Publication number: CN116957984A
- Application number: CN202311001014.0A
- Authority: CN (China)
- Prior art keywords: light source, input image, transmittance, image, pixel
- Prior art date: 2023-08-10
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
Abstract
The application provides a method and a system for monitoring an unmanned aerial vehicle under low-illuminance haze, in the field of image processing. The method first preprocesses the low-illuminance haze image with a hybrid of side window filtering and fast edge-preserving filtering to obtain a more accurate ambient light estimate. It then combines the image with the ambient light estimate to set a light source threshold, and solves the transmittance of the light source region and of the non-light source region separately through an adaptive light source matrix mechanism, further refining the light source region and better preserving its brightness values. The two transmittances are then fused, and the transmittance map is light-source-compensated via gamma correction. Finally, the low-illuminance haze image is defogged with an atmospheric scattering model. The processed image is bright overall, the light source region shows no halo effect, the sky region has little color cast, and the image has clear texture detail and vivid colors, so that the unmanned aerial vehicle can be monitored efficiently and dangerous articles it carries can be observed.
Description
Technical Field
The application relates to the technical field of image processing, and in particular to a method and a system for monitoring an unmanned aerial vehicle under low-illuminance haze.
Background
With the development of technology, unmanned aerial vehicles have drawn increasing attention in recent years. Beyond military applications, their civil application scenarios keep expanding: aerial photography for entertainment, agricultural plant protection, police security, power line inspection, surveying and mapping, logistics and transportation, staged performances, and more. However, while unmanned aerial vehicle technology brings benefits, it also brings problems: for aviation authorities, unmanned aerial vehicles are difficult to discover, supervise, and police. During aircraft takeoff and landing, failing to recognize the actions of a nearby unmanned aerial vehicle in time, or to observe dangerous objects it may carry, can easily cause a crash and threaten human life and property; effective monitoring is therefore a precondition for controlling unmanned aerial vehicles.
Consider the working environment of the unmanned aerial vehicle. In daytime scenes it is easily captured by image acquisition equipment, its actions can be monitored, and dangerous objects it carries can be identified. When it operates in a low-illuminance environment accompanied by haze, however, monitoring becomes much harder, and existing low-illuminance defogging methods have many shortcomings: poor color saturation, blurred texture detail, and heavy noise. For the problem of uneven illumination at night, some researchers proposed defogging algorithms that first compensate illumination and then correct color; although the color appears improved, the illumination estimate used in compensation is inaccurate, so glare regions in the image are handled poorly, and the restored image shows obvious flare and heavy noise. Other researchers proposed defogging first and applying illumination compensation and color correction afterwards, but the transmittance cannot be estimated reasonably, so the restored colors are distorted and the defogging effect is poor. Still others, noting that artificial light sources at night produce glare and non-uniform illumination, added a glow layer to the standard daytime defogging model, obtained a layer-separation result by removing the glow layer, re-estimated the nighttime atmospheric light block by block, and estimated the transmittance via the dark channel theory to obtain a restored image; although the defogging effect is good, the restored image is dark overall and its texture detail unclear because no illumination compensation or brightness enhancement is applied, and an image that displays dark after defogging loses detail.
Some solutions for monitoring unmanned aerial vehicles have also been proposed in published patent applications, for example CN108734670A and CN115170404A. Although these schemes improve low-illuminance defogging, they are too general and cannot specifically process unmanned aerial vehicle images under low-illuminance haze. For example, they do not consider the actual operating state of the drone: when it works under low illuminance it carries lights, which affect the defogged image and easily produce halo effects. Moreover, when the image contains a large sky area, color cast and artifacts appear in the sky after defogging, so the defogging effect is poor and the drone is hard to observe. Patent application CN115616479A discloses a dedicated device and system for monitoring unmanned aerial vehicles, but it cannot directly reveal dangerous objects carried by the drone, which may cause serious accidents.
Therefore, given the technical problem of monitoring unmanned aerial vehicles in this special scenario, a low-illuminance defogging method applicable to it is needed, so that unmanned aerial vehicles can be monitored better and the dangerous objects they carry can be identified.
Disclosure of Invention
The application aims to provide a method and a system for monitoring an unmanned aerial vehicle based on low-illuminance haze, which improve the defogging method on the basis of the dark channel prior theory, improve the quality of the output image, solve the problem of monitoring unmanned aerial vehicles under low-illuminance haze, and identify dangerous objects carried by the drone under complex conditions.
In order to achieve the above purpose, the present application proposes the following technical scheme:
In a first aspect, a method for monitoring an unmanned aerial vehicle based on low-illuminance haze is disclosed, comprising:
acquiring pixels of an input image, and preprocessing the input image with a hybrid of side window filtering and fast edge-preserving filtering to obtain an ambient light estimate of the input image;
setting a light source threshold of the input image according to the ambient light estimate;
dividing the input image into a light source region and a non-light source region according to the light source threshold, optimizing the transmittance of the light source region and of the non-light source region separately with an adaptive light source matrix mechanism, and then fusing the two to solve the initial transmittance;
performing light source compensation on the initial transmittance to obtain and output the final transmittance of the input image;
and performing defogging calculation on the input image with an atmospheric scattering model according to the ambient light estimate and the final transmittance, to obtain a defogged output image.
Further, the process of processing the input image with the hybrid of side window filtering and fast edge-preserving filtering includes:
acquiring pixels of the input image, treating each pixel as a potential edge with side window filtering, and generating a plurality of side windows around each pixel;
processing the input image, including adjusting its brightness, and for each pixel obtaining and outputting the side window whose filtered value has the minimum Euclidean distance to the input image, so as to preserve the edge information of the input image;
obtaining a filtered image from the output minimum-distance side windows, and processing the filtered image with fast edge-preserving filtering to obtain a preprocessed image;
and calculating an ambient light estimate from the preprocessed image by way of the dark channel prior.
Further, the process of setting the light source threshold of the input image according to the ambient light estimate includes:
calculating the luminosity value of each pixel point in the input image;
and calculating the difference between each luminosity value and the ambient light estimate, and taking the absolute value of the largest difference as the light source threshold.
Further, the process of dividing the input image into a light source region and a non-light source region according to the light source threshold, optimizing the transmittance of each region separately with an adaptive light source matrix mechanism, and then fusing the two to solve the initial transmittance includes:
dividing each pixel point of the input image into the light source region or the non-light source region: when the luminosity value of the pixel point is larger than the light source threshold, the pixel point is assigned to the light source region; otherwise it is assigned to the non-light source region;
when a pixel point belongs to the light source region, calculating and optimizing the transmittance of the light source region with the adaptive light source matrix mechanism, comprising the following steps:
numbering the pixel points of the input image in descending order of luminosity value according to the light source matrix mechanism, and calculating for each pixel point a first light source influence matrix, a second light source influence matrix, and a pixel light source influence matrix; specifically, the size of the input image is defined as w × h, with x ∈ [1, w] and y ∈ [1, h], and the pixel points are numbered {m_0, m_1, ..., m_{T-1}} in descending order of luminosity value, where T is the total number of pixel points; then,
at the value of x, the first light source influence matrix k_x is calculated as: (formula not reproduced in the text)
at the value of y, the second light source influence matrix k_y is calculated as: (formula not reproduced in the text)
the pixel light source influence matrix K_x is calculated as: (formula not reproduced in the text)
where C_x and C_y represent the luminosity values of the pixel point at the corresponding x and y values, M is the light source threshold, and d_{x,m} represents the distance from the other pixels to the selected pixel point;
obtaining from the pixel light source influence matrix an adjustment correction coefficient w_x used to optimize the transmittance of the light source region,
where α represents an adjustment correction factor of the input image, and t_x represents the initial transmittance;
when a pixel point belongs to the non-light source region, calculating the transmittance of the non-light source region through the dark channel prior theory;
fusing the optimized transmittance of the light source region with the transmittance of the non-light source region to calculate the initial transmittance t_M,
where Ω represents the light source region, t_{x∉Ω} represents the transmittance of the non-light source region, and w_x · t_{x∈Ω} represents the transmittance of the light source region.
Further, the process of performing light source compensation on the initial transmittance to obtain and output the final transmittance of the input image is:
applying gamma correction, and performing light source compensation on the initial transmittance by adjusting a compensation coefficient, to obtain the final transmittance.
Further, the process of obtaining the defogged output image includes:
performing defogging calculation on each pixel point of the input image with the atmospheric scattering model, according to the ambient light estimate and the final transmittance; the formula is:
J_x = (I_x - A) / t_F + A
where I_x represents the initial image at any pixel point of the input image, A is the ambient light estimate, t_F is the final transmittance, and J_x represents the defogged image of that pixel point;
and synthesizing the defogged images of the pixel points to obtain the defogged output image.
In a second aspect, a system for monitoring an unmanned aerial vehicle based on low-illuminance haze is disclosed, comprising:
an acquisition processing module, configured to acquire pixels of an input image and preprocess the input image with a hybrid of side window filtering and fast edge-preserving filtering to obtain an ambient light estimate of the input image;
a setting module, configured to set a light source threshold of the input image according to the ambient light estimate;
a division solving module, configured to divide the input image into a light source region and a non-light source region according to the light source threshold, optimize the transmittance of each region separately with an adaptive light source matrix mechanism, and then fuse the two to solve the initial transmittance;
a compensation module, configured to perform light source compensation on the initial transmittance to obtain and output the final transmittance of the input image;
and a calculation module, configured to perform defogging calculation on the input image with the atmospheric scattering model according to the ambient light estimate and the final transmittance, to obtain a defogged output image.
Further, the execution units with which the acquisition processing module obtains the ambient light estimate of the input image include:
an acquisition unit, configured to acquire pixels of the input image, treat each pixel as a potential edge with side window filtering, and generate a plurality of side windows around each pixel;
an adjusting unit, configured to process the input image, including adjusting its brightness, and for each pixel to obtain and output the side window whose filtered value has the minimum Euclidean distance to the input image, so as to preserve the edge information of the input image;
a processing unit, configured to obtain a filtered image from the output minimum-distance side windows, and to process the filtered image with fast edge-preserving filtering to obtain a preprocessed image;
and a calculating unit, configured to calculate an ambient light estimate from the preprocessed image by way of the dark channel prior.
In a third aspect, a computer device is disclosed, comprising at least one processor and a memory coupled to the processor, the memory storing programs or instructions that are executed on the processor to implement the steps of the method for monitoring an unmanned aerial vehicle based on low-illuminance haze described above.
In a fourth aspect, a readable storage medium is disclosed, storing programs or instructions which, when executed by a processor, implement the steps of the method for monitoring an unmanned aerial vehicle based on low-illuminance haze described above.
The above technical scheme achieves the following beneficial effects:
The application discloses a method and a system for monitoring an unmanned aerial vehicle based on low-illuminance haze, aimed at the problem of monitoring the drone under low-illuminance haze, with the following advantages in application:
(1) The scheme preprocesses the low-illuminance haze image with a hybrid of side window filtering and fast edge-preserving filtering, eliminating the influence of high light source regions and pseudo light source regions on the ambient light estimate solved from the image and improving its accuracy; this solves the problem of the low-illuminance haze image appearing too dark after defogging, so the drone can be monitored more clearly.
(2) The scheme sets a light source threshold from the brightness values of the image and the processed ambient light estimate, fuses the transmittances of the different light source regions with the light source matrix mechanism, and then performs light source compensation via gamma correction to obtain the final transmittance map; this solves the halo effect that the light source region of a low-illuminance image exhibits after defogging and improves the texture detail of the image, so the drone under low-illuminance haze can be monitored better and the dangerous objects it carries identified.
(3) By modifying the ambient light estimation and optimizing the transmittance solving process, the scheme improves the display of the sky area, reduces color cast, and alleviates artifacts and noise, so the monitoring of the drone is not impaired and carried dangerous articles can be identified clearly.
It should be understood that all combinations of the foregoing concepts, as well as additional concepts described in more detail below, may be considered a part of the inventive subject matter of the present disclosure as long as such concepts are not mutually inconsistent.
The foregoing and other aspects, embodiments, and features of the present teachings will be more fully understood from the following description, taken together with the accompanying drawings. Other additional aspects of the application, such as features and/or advantages of the exemplary embodiments, will be apparent from the description which follows, or may be learned by practice of the embodiments according to the teachings of the application.
Drawings
The drawings are not necessarily drawn to scale. In the drawings, identical or nearly identical components illustrated in the various figures may be represented by the same numeral. For clarity, not every component is labeled in every drawing. Embodiments of various aspects of the application will now be described, by way of example, with reference to the accompanying drawings, in which:
fig. 1 is a flow chart of the method for monitoring an unmanned aerial vehicle based on low-illuminance haze disclosed by the application;
FIG. 2 is a flow chart of obtaining the ambient light estimate of the input image;
FIG. 3 is a flow chart of setting the light source threshold of the input image;
fig. 4 is a frame diagram of the system for monitoring an unmanned aerial vehicle based on low-illuminance haze disclosed by the application;
FIG. 5 is a schematic diagram of an electronic device according to an embodiment of the application;
fig. 6 (a), (b), and (c) are monitoring photographs of the unmanned aerial vehicle under low-illuminance haze in the embodiment;
fig. 7 (a), (b), and (c) are effect diagrams of the drone monitoring images of fig. 6 after the hybrid filtering process;
fig. 8 (a), (b), and (c) are the transmittance images of fig. 7 obtained by fusion with the adaptive light source matrix mechanism;
fig. 9 (a), (b), and (c) are the drone monitoring effect diagrams of fig. 8 after defogging.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings of the embodiments of the present application. It will be apparent that the described embodiments are some, but not all, embodiments of the application. All other embodiments, which can be made by a person skilled in the art without creative efforts, based on the described embodiments of the present application fall within the protection scope of the present application. Unless defined otherwise, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs.
The terms "first," "second," and the like in the description and in the claims, are not used for any order, quantity, or importance, but are used for distinguishing between different elements. Also, unless the context clearly indicates otherwise, singular forms "a," "an," or "the" and similar terms do not denote a limitation of quantity, but rather denote the presence of at least one. The terms "comprises," "comprising," or the like are intended to cover a feature, integer, step, operation, element, and/or component recited as being present in the element or article that "comprises" or "comprising" does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. "up", "down", "left", "right" and the like are used only to indicate a relative positional relationship, and when the absolute position of the object to be described is changed, the relative positional relationship may be changed accordingly.
Based on the prior art, the following problems arise when image techniques are used to monitor an unmanned aerial vehicle under low-illuminance haze: 1) when the existing dark channel prior theory processes daytime haze images, a single ambient light estimate is selected; under nighttime conditions, solving the ambient light value this way deviates, and the defogging result appears dark; 2) although guided filtering can alleviate the blocking effect in the transmittance image caused by block-wise processing, when the coarse transmittance values of two adjacent blocks differ greatly, the residual blocking effect is still obvious, producing a clear halo effect near strong lights in the final output image; in addition, although existing algorithms have been improved, they ignore the influence of the sky area, where color cast and artifacts appear. The application therefore proposes a low-illuminance image defogging method applicable to special scenes such as low-illuminance haze, which solves the above problems and enables monitoring of the unmanned aerial vehicle and recognition of the dangerous objects it carries.
The method and system for monitoring an unmanned aerial vehicle under low-illuminance haze disclosed by the application are described below based on the embodiments shown in the drawings.
Referring to fig. 1, the method for monitoring an unmanned aerial vehicle based on low-illuminance haze disclosed in this embodiment includes the following steps:
step S102, acquiring pixels of an input image, and preprocessing the input image with a hybrid of side window filtering and fast edge-preserving filtering to obtain an ambient light estimate of the input image; the aim is to process the high light source points and pseudo light source points of the image and eliminate their influence on the ambient light estimation;
step S104, setting a light source threshold of the input image according to the ambient light estimate;
step S106, dividing the input image into a light source region and a non-light source region according to the light source threshold, optimizing the transmittance of each region separately with an adaptive light source matrix mechanism, and then fusing the two to solve the initial transmittance;
step S108, performing light source compensation on the initial transmittance to obtain and output the final transmittance of the input image;
and step S110, performing defogging calculation on the input image with the atmospheric scattering model according to the ambient light estimate and the final transmittance, to obtain a defogged output image.
Specifically, as shown in fig. 2, the process of processing the input image with the hybrid of side window filtering and fast edge-preserving filtering includes the following steps: step S1022, acquiring pixels of the input image, treating each pixel as a potential edge with side window filtering, and generating a plurality of side windows around each pixel; step S1024, processing the input image, including adjusting its brightness, and for each pixel obtaining and outputting the side window whose filtered value has the minimum Euclidean distance to the input image, so as to preserve the edge information of the input image; step S1026, obtaining a filtered image from the output minimum-distance side windows, and processing the filtered image with fast edge-preserving filtering to obtain a preprocessed image; step S1028, calculating an ambient light estimate from the preprocessed image by way of the dark channel prior.
The calculation by which the side window filtering of steps S1022 to S1024 obtains the minimum-distance side window is as follows: (formula not reproduced in the text)
where S = {U, D, L, R, NW, NE, SW, SE} represents the set of eight side windows generated around the pixel; n ∈ S represents any side window of the set S; I_n represents the filtered value of the side window; q_i represents a weight of a pixel j in the vicinity of the target pixel i based on the kernel function F; w_ij is the pixel value of a pixel; N_n represents the sum of the pixel values of the eight side windows; and I_sw represents the side window with the minimum Euclidean distance from the pixel of the input image.
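For concreteness, the side window selection of steps S1022 to S1024 can be sketched in code. This is a minimal grayscale implementation that assumes a box mean for the kernel function F (the published formula is not reproduced above) and uses the per-pixel absolute difference as the Euclidean distance on a single channel:

```python
import numpy as np

def _window_mean(img, top, bot, left, right):
    # Mean of img over the window [y-top, y+bot] x [x-left, x+right] at every
    # pixel, computed with an integral image over an edge-padded copy.
    h, w = img.shape
    p = np.pad(img, ((top, bot), (left, right)), mode='edge').astype(np.float64)
    S = np.zeros((p.shape[0] + 1, p.shape[1] + 1))
    S[1:, 1:] = p.cumsum(0).cumsum(1)
    dy, dx = top + bot + 1, left + right + 1
    sums = S[dy:, dx:] - S[:h, dx:] - S[dy:, :w] + S[:h, :w]
    return (sums / (dy * dx)).astype(np.float32)

def side_window_filter(img, r=2):
    # One pass of side-window mean filtering on a float32 grayscale image.
    # The eight windows follow the set S = {U, D, L, R, NW, NE, SW, SE}.
    windows = {
        'U': (r, 0, r, r), 'D': (0, r, r, r),
        'L': (r, r, r, 0), 'R': (r, r, 0, r),
        'NW': (r, 0, r, 0), 'NE': (r, 0, 0, r),
        'SW': (0, r, r, 0), 'SE': (0, r, 0, r),
    }
    best = np.full(img.shape, np.inf, dtype=np.float32)
    out = np.array(img, dtype=np.float32, copy=True)
    for top, bot, left, right in windows.values():
        I_n = _window_mean(img, top, bot, left, right)
        d = np.abs(I_n - img)      # distance of this window's output to the input
        better = d < best
        best[better] = d[better]
        out[better] = I_n[better]  # keep I_sw, the window closest to the pixel
    return out
```

On an RGB image the filter is applied per channel, and iterating it several times strengthens the edge-preserving behavior, as in the side window filtering literature.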
Step S104 sets the light source threshold of the input image according to the ambient light estimate and, as shown in fig. 3, includes the following steps: step S1042, calculating the luminosity value of each pixel point in the input image; step S1044, calculating the difference between each luminosity value and the ambient light estimate, and taking the absolute value of the largest difference as the light source threshold.
That is, the difference between the luminosity value I_x of each pixel point of the image and the ambient light estimate A is obtained, and |I_x - A|_max is taken as the light source threshold, used to distinguish the light source region from the non-light source region.
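As a code sketch of steps S1042 to S1044, with I a float RGB image in [0, 1] and A the scalar ambient light estimate from the preprocessing stage (a scalar A and a max-over-channels luminosity are assumptions; the text fixes neither):

```python
import numpy as np

def light_source_threshold(I, A):
    # Luminosity value of each pixel point (max over channels, an assumption),
    # then the light source threshold M = |I_x - A|_max and the region mask.
    lum = I.max(axis=2)
    M = float(np.abs(lum - A).max())
    return lum, M, lum > M  # mask is True where a pixel is in the light source region
```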
In a specific embodiment, step S106 divides the input image into a light source region and a non-light source region according to the light source threshold, optimizes the transmittance of each region separately with an adaptive light source matrix mechanism, and then fuses the two to solve the initial transmittance, through the following calculation process:
dividing each pixel point of the input image into the light source region or the non-light source region: when the luminosity value of the pixel point is larger than the light source threshold, the pixel point is assigned to the light source region; otherwise it is assigned to the non-light source region;
when a pixel point belongs to the light source region, i.e. I_x > |I_x - A|_max, calculating and optimizing the transmittance of the light source region with the adaptive light source matrix mechanism, comprising the following steps:
numbering the pixel points of the input image in descending order of luminosity value according to the light source matrix mechanism, and calculating for each pixel point a first light source influence matrix, a second light source influence matrix, and a pixel light source influence matrix; specifically, the size of the input image is defined as w × h, with x ∈ [1, w] and y ∈ [1, h], and the pixel points are numbered {m_0, m_1, ..., m_{T-1}} in descending order of luminosity value, where T is the total number of pixel points; then,
at the value of x, the first light source influence matrix k_x is calculated as: (formula not reproduced in the text)
at the value of y, the second light source influence matrix k_y is calculated as: (formula not reproduced in the text)
the pixel light source influence matrix K_x is calculated as: (formula not reproduced in the text)
where C_x and C_y represent the luminosity values of the pixel point at the corresponding x and y values, M is the light source threshold, and d_{x,m} represents the distance from the other pixels to the selected pixel point;
obtaining from the pixel light source influence matrix an adjustment correction coefficient w_x used to optimize the transmittance of the light source region,
where α represents an adjustment correction factor of the input image, set to 0.5 in the embodiment, and t_x represents the initial transmittance;
when a pixel point belongs to the non-light source region, i.e. I_x ≤ |I_x - A|_max, calculating the transmittance of the non-light source region through the dark channel prior theory;
fusing the optimized transmittance of the light source region with the transmittance of the non-light source region to calculate the initial transmittance t_M,
where Ω represents the light source region, t_{x∉Ω} represents the transmittance of the non-light source region, and w_x · t_{x∈Ω} represents the transmittance of the light source region.
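A hedged sketch of this fusion follows. Because the k_x, k_y, and K_x formulas are not reproduced in the published text, the influence term k below is a simplified stand-in (normalized luminosity excess, with no distance term d_{x,m}); the non-light-source branch is the standard dark channel prior estimate:

```python
import numpy as np
from scipy.ndimage import minimum_filter

def initial_transmittance(I, A, lum, M, alpha=0.5, omega=0.95, patch=15):
    # Dark channel prior transmittance, used as-is in the non-light source region.
    t_dcp = 1.0 - omega * minimum_filter((I / A).min(axis=2), size=patch)
    light = lum > M                                        # light source region Omega
    k = np.clip((lum - M) / (lum.max() - M + 1e-6), 0, 1)  # stand-in for K_x
    w = 1.0 + alpha * k                                    # correction coefficient w_x (assumed form)
    # Piecewise fusion into t_M: w_x * t_x inside Omega, the dark channel value elsewhere.
    return np.where(light, np.clip(w * t_dcp, 0.0, 1.0), t_dcp)
```

Raising the transmittance in the light source region keeps the recovered pixels there close to the input, which is how the mechanism preserves the brightness of the lights.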
Further to the embodiment, the application applies gamma correction, performing light source compensation on the initial transmittance by adjusting the compensation coefficient, and finally outputs the final transmittance of the input image; in practice, the compensation coefficient is set to 0.8.
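Continuing from the initial transmittance t_M of the previous sketch, the compensation step reduces to a power law; the lower clip is a numerical safeguard added here, not part of the text:

```python
t_F = np.clip(t_M, 0.1, 1.0) ** 0.8  # gamma correction; 0.8 is the embodiment's compensation coefficient
```

Since t_M lies in (0, 1], raising it to the 0.8 power increases it, tempering the correction in dark regions and helping the light source region keep its brightness.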
The process of obtaining the defogged output image in step S110 of the application includes the following calculation:
first, defogging calculation is performed on each pixel point of the input image with the atmospheric scattering model, according to the ambient light estimate and the final transmittance; the formula is:
J_x = (I_x - A) / t_F + A
where I_x represents the initial image at any pixel point of the input image, A is the ambient light estimate, t_F is the final transmittance, and J_x represents the defogged image of that pixel point;
then, the defogged images of the pixel points are synthesized to obtain the defogged output image.
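As a sketch of the per-pixel recovery and synthesis, with I the float RGB input in [0, 1], A the scalar ambient light estimate, and t_F the final transmittance map from the previous step:

```python
J = (I - A) / t_F[..., None] + A                          # J_x = (I_x - A) / t_F + A, per channel
output = np.clip(J * 255.0, 0.0, 255.0).astype(np.uint8)  # synthesize the 8-bit defogged image
```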
According to the method for monitoring an unmanned aerial vehicle under low-illuminance haze provided by this embodiment, the low-illuminance haze image is first preprocessed with a hybrid of side window filtering and fast edge-preserving filtering, eliminating the influence of high light source regions and pseudo light source regions on the ambient light estimate solved from the image; a light source threshold is then set from the image and the ambient light estimate, the transmittances of the different light source regions are fused with the light source matrix mechanism, and light source compensation is performed through gamma correction to obtain the final transmittance map, solving the halo effect that the light source region of a low-illuminance image exhibits after defogging; finally, by modifying the ambient light estimation and optimizing the transmittance solving process, the display of the sky area is improved, color cast is reduced, and the artifacts and noise easily produced during image processing are alleviated. The method can be applied effectively to drone recognition in special scenes such as low-illuminance haze and improves the safe control of unmanned aerial vehicles.
An embodiment of the present application further provides a computer device, as shown in fig. 5, comprising at least one processor and a memory coupled to the processor, the memory storing programs or instructions that run on the processor; the programs or instructions are executed by the processor to implement the steps of the method for monitoring an unmanned aerial vehicle based on low-illuminance haze disclosed in the above embodiment.
The above-described programs may run on a processor or be stored in a memory, i.e., a computer-readable medium, which includes permanent and non-permanent, removable and non-removable media and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
These computer programs may also be loaded onto a computer or other programmable data processing apparatus, so that a series of operational steps performed on the computer or other programmable apparatus produce a computer-implemented process, such that the instructions executed on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart and/or block diagram block or blocks; the corresponding method steps may be implemented in different modules.
This embodiment also provides a device or system, which may be called a system for monitoring an unmanned aerial vehicle based on low-illuminance haze, as shown in fig. 4, comprising: an acquisition processing module, configured to acquire pixels of an input image and preprocess the input image with a hybrid of side window filtering and fast edge-preserving filtering to obtain an ambient light estimate of the input image; a setting module, configured to set a light source threshold of the input image according to the ambient light estimate; a division solving module, configured to divide the input image into a light source region and a non-light source region according to the light source threshold, optimize the transmittance of each region separately with an adaptive light source matrix mechanism, and then fuse the two to solve the initial transmittance; a compensation module, configured to perform light source compensation on the initial transmittance to obtain and output the final transmittance of the input image; and a calculation module, configured to perform defogging calculation on the input image with the atmospheric scattering model according to the ambient light estimate and the final transmittance, to obtain a defogged output image.
The steps performed by the system for monitoring an unmanned aerial vehicle based on low-illuminance haze have been described in the above embodiments and are not repeated here in detail.
For example, the execution units with which the acquisition processing module obtains the ambient light estimate of the input image include: an acquisition unit, configured to acquire pixels of the input image, treat each pixel as a potential edge with side window filtering, and generate a plurality of side windows around each pixel; an adjusting unit, configured to process the input image, including adjusting its brightness, and for each pixel to obtain and output the side window whose filtered value has the minimum Euclidean distance to the input image, so as to preserve the edge information of the input image; a processing unit, configured to obtain a filtered image from the output minimum-distance side windows, and to process the filtered image with fast edge-preserving filtering to obtain a preprocessed image; and a calculating unit, configured to calculate an ambient light estimate from the preprocessed image by way of the dark channel prior.
The adjusting unit obtains the side window with the minimum Euclidean distance between each pixel and the input image as follows: (formula not reproduced in the text)
where S = {U, D, L, R, NW, NE, SW, SE} represents the set of eight side windows generated around the pixel; n ∈ S represents any side window of the set S; I_n represents the filtered value of the side window; q_i represents a weight of a pixel j in the vicinity of the target pixel i based on the kernel function F; w_ij is the pixel value of a pixel; N_n represents the sum of the pixel values of the eight side windows; and I_sw represents the side window with the minimum Euclidean distance from the pixel of the input image.
For another example, the setting module sets the light source threshold of the input image according to the ambient light estimate through the following execution units: a second calculation unit, configured to calculate the luminosity value of each pixel point in the input image; and a third calculation unit, configured to calculate the difference between each luminosity value and the ambient light estimate and take the absolute value of the largest difference as the light source threshold.
For another example, the division solving module divides the input image into a light source region and a non-light source region according to the light source threshold, optimizes the transmittance of each region separately with an adaptive light source matrix mechanism, and then fuses the two to solve the initial transmittance, through the following calculation process:
dividing each pixel point of the input image into the light source region or the non-light source region: when the luminosity value of the pixel point is larger than the light source threshold, the pixel point is assigned to the light source region; otherwise it is assigned to the non-light source region;
when a pixel point belongs to the light source region, i.e. I_x > |I_x - A|_max, calculating and optimizing the transmittance of the light source region with the adaptive light source matrix mechanism, comprising the following steps:
numbering the pixel points of the input image in descending order of luminosity value according to the light source matrix mechanism, and calculating for each pixel point a first light source influence matrix, a second light source influence matrix, and a pixel light source influence matrix; specifically, the size of the input image is defined as w × h, with x ∈ [1, w] and y ∈ [1, h], and the pixel points are numbered {m_0, m_1, ..., m_{T-1}} in descending order of luminosity value, where T is the total number of pixel points; then,
at the value of x, the first light source influence matrix k_x is calculated as: (formula not reproduced in the text)
at the value of y, the second light source influence matrix k_y is calculated as: (formula not reproduced in the text)
the pixel light source influence matrix K_x is calculated as: (formula not reproduced in the text)
where C_x and C_y represent the luminosity values of the pixel point at the corresponding x and y values, M is the light source threshold, and d_{x,m} represents the distance from the other pixels to the selected pixel point;
obtaining from the pixel light source influence matrix an adjustment correction coefficient w_x used to optimize the transmittance of the light source region,
where α represents an adjustment correction factor of the input image, set to 0.5 in the embodiment, and t_x represents the initial transmittance;
when a pixel point belongs to the non-light source region, i.e. I_x ≤ |I_x - A|_max, calculating the transmittance of the non-light source region through the dark channel prior theory;
fusing the optimized transmittance of the light source region with the transmittance of the non-light source region to calculate the initial transmittance t_M,
where Ω represents the light source region, t_{x∉Ω} represents the transmittance of the non-light source region, and w_x · t_{x∈Ω} represents the transmittance of the light source region.
For another example, the calculation module obtains the defogged output image through the following execution units:
a fourth calculation unit, configured to perform defogging calculation on each pixel point of the input image with the atmospheric scattering model, according to the ambient light estimate and the final transmittance; the formula is:
J_x = (I_x - A) / t_F + A
where I_x represents the initial image at any pixel point of the input image, A is the ambient light estimate, t_F is the final transmittance, and J_x represents the defogged image of that pixel point;
and an output unit, configured to synthesize the defogged images of the pixel points to obtain the defogged output image.
The specific implementation process of the application is described fully below in combination with a drone flight experiment in the low-illuminance haze scene shown in the drawings.
First, the light source points are preprocessed with the hybrid of side window filtering and fast edge-preserving filtering, eliminating the influence of the brightness values of high light source points and pseudo light source points on the solved ambient light estimate in the low-illuminance haze scene; specifically, fig. 6 shows the directly captured images under low-illuminance haze and fig. 7 the processed images, so the accuracy of the ambient light value is improved.
Second, the difference between the brightness value of each pixel of the image and the ambient light estimate is calculated, and the maximum is taken as the light source threshold to distinguish the light source region from the non-light source region; for the non-light source region, the transmittance is solved with the traditional dark channel prior theory; for the light source region, the transmittance is optimized with the light source matrix mechanism, and the transmittance map is then solved by fusion; light source compensation is then applied to the transmittance map via gamma correction to obtain the final transmittance map, as shown in fig. 8; the figure shows that overall brightness and detail are greatly improved, so the light source region is well preserved and halo effects are unlikely to appear during defogging.
Finally, defogging calculation is performed on the low-illuminance haze image with the ambient light estimate and the transmittance map processed in the above steps, obtaining the defogged output image; as shown in fig. 9, the whole image is displayed brightly and clearly, the light of the light source region is kept well without halo effect, the color cast problem in the sky area is reduced, and artifacts and noise are avoided, so that the drone can be monitored effectively in a low-illuminance haze setting, its actions near the airfield recognized during aircraft takeoff and landing, and major accidents avoided.
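Composing the sketches above gives a minimal end-to-end pipeline for this flow. The ambient light estimator below, its percentile and patch size, and the omission of the fast edge-preserving stage are illustrative assumptions rather than the patented procedure:

```python
import numpy as np
from scipy.ndimage import minimum_filter

def estimate_ambient_light(I, patch=15, pct=99.9):
    # Dark channel prior estimate: mean luminosity of the brightest dark-channel pixels.
    dark = minimum_filter(I.min(axis=2), size=patch)
    return float(I.max(axis=2)[dark >= np.percentile(dark, pct)].mean())

def dehaze_low_light(img_bgr):
    I = img_bgr.astype(np.float32) / 255.0
    # S102: side-window pre-filtering per channel, then the ambient light estimate A.
    pre = np.dstack([side_window_filter(I[..., c]) for c in range(3)])
    A = estimate_ambient_light(pre)
    # S104: light source threshold and region mask.
    lum, M, _ = light_source_threshold(I, A)
    # S106: fused initial transmittance t_M.
    t_M = initial_transmittance(I, A, lum, M)
    # S108: gamma-correction light source compensation.
    t_F = np.clip(t_M, 0.1, 1.0) ** 0.8
    # S110: invert the atmospheric scattering model and synthesize the output.
    J = (I - A) / t_F[..., None] + A
    return np.clip(J * 255.0, 0.0, 255.0).astype(np.uint8)
```

A call such as cv2.imwrite('dehazed.png', dehaze_low_light(cv2.imread('drone_night.png'))) would exercise the whole chain on one frame; the file names are placeholders.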
While the application has been described with reference to preferred embodiments, it is not limited thereto. Those skilled in the art will appreciate that various modifications and adaptations can be made without departing from the spirit and scope of the present application. Accordingly, the scope of the application is defined by the appended claims.
Claims (10)
1. A method for monitoring an unmanned aerial vehicle based on low-illuminance haze, characterized by comprising the following steps:
acquiring pixels of an input image, and preprocessing the input image with a hybrid of side window filtering and fast edge-preserving filtering to obtain an ambient light estimate of the input image;
setting a light source threshold of the input image according to the ambient light estimate;
dividing the input image into a light source region and a non-light source region according to the light source threshold, optimizing the transmittance of the light source region and of the non-light source region separately with an adaptive light source matrix mechanism, and then fusing the two to solve the initial transmittance;
performing light source compensation on the initial transmittance to obtain and output the final transmittance of the input image;
and performing defogging calculation on the input image with an atmospheric scattering model according to the ambient light estimate and the final transmittance, to obtain a defogged output image.
2. The method for monitoring an unmanned aerial vehicle based on low-illuminance haze according to claim 1, characterized in that the process of processing the input image with the hybrid of side window filtering and fast edge-preserving filtering includes:
acquiring pixels of the input image, treating each pixel as a potential edge with side window filtering, and generating a plurality of side windows around each pixel;
processing the input image, including adjusting its brightness, and for each pixel obtaining and outputting the side window whose filtered value has the minimum Euclidean distance to the input image, so as to preserve the edge information of the input image;
obtaining a filtered image from the output minimum-distance side windows, and processing the filtered image with fast edge-preserving filtering to obtain a preprocessed image;
and calculating an ambient light estimate from the preprocessed image by way of the dark channel prior.
3. The method according to claim 1, characterized in that the process of setting the light source threshold of the input image according to the ambient light estimate includes:
calculating the luminosity value of each pixel point in the input image;
and calculating the difference between each luminosity value and the ambient light estimate, and taking the absolute value of the largest difference as the light source threshold.
4. The method for monitoring an unmanned aerial vehicle based on low-illuminance haze according to claim 1, characterized in that the process of dividing the input image into a light source region and a non-light source region according to the light source threshold, optimizing the transmittance of each region separately with an adaptive light source matrix mechanism, and then fusing the two to solve the initial transmittance includes:
dividing each pixel point of the input image into the light source region or the non-light source region: when the luminosity value of the pixel point is larger than the light source threshold, the pixel point is assigned to the light source region; otherwise it is assigned to the non-light source region;
when a pixel point belongs to the light source region, calculating and optimizing the transmittance of the light source region with the adaptive light source matrix mechanism, comprising the following steps:
numbering the pixel points of the input image in descending order of luminosity value according to the light source matrix mechanism, and calculating for each pixel point a first light source influence matrix, a second light source influence matrix, and a pixel light source influence matrix; specifically, the size of the input image is defined as w × h, with x ∈ [1, w] and y ∈ [1, h], and the pixel points are numbered {m_0, m_1, ..., m_{T-1}} in descending order of luminosity value, where T is the total number of pixel points; then,
at the value of x, the first light source influence matrix k_x is calculated as: (formula not reproduced in the text)
at the value of y, the second light source influence matrix k_y is calculated as: (formula not reproduced in the text)
the pixel light source influence matrix K_x is calculated as: (formula not reproduced in the text)
where C_x and C_y represent the luminosity values of the pixel point at the corresponding x and y values, M is the light source threshold, and d_{x,m} represents the distance from the other pixels to the selected pixel point;
obtaining from the pixel light source influence matrix an adjustment correction coefficient w_x used to optimize the transmittance of the light source region,
where α represents an adjustment correction factor of the input image, and t_x represents the initial transmittance;
when a pixel point belongs to the non-light source region, calculating the transmittance of the non-light source region through the dark channel prior theory;
fusing the optimized transmittance of the light source region with the transmittance of the non-light source region to calculate the initial transmittance t_M,
where Ω represents the light source region, t_{x∉Ω} represents the transmittance of the non-light source region, and w_x · t_{x∈Ω} represents the transmittance of the light source region.
5. The method for monitoring an unmanned aerial vehicle based on low-illuminance haze according to claim 1, characterized in that the process of performing light source compensation on the initial transmittance to obtain and output the final transmittance of the input image is:
applying gamma correction, and performing light source compensation on the initial transmittance by adjusting a compensation coefficient, to obtain the final transmittance.
6. The method for monitoring an unmanned aerial vehicle based on low-illuminance haze according to claim 1, characterized in that the process of obtaining the defogged output image includes:
performing defogging calculation on each pixel point of the input image with the atmospheric scattering model, according to the ambient light estimate and the final transmittance; the formula is:
J_x = (I_x - A) / t_F + A
where I_x represents the initial image at any pixel point of the input image, A is the ambient light estimate, t_F is the final transmittance, and J_x represents the defogged image of that pixel point;
and synthesizing the defogged images of the pixel points to obtain the defogged output image.
7. A system for monitoring an unmanned aerial vehicle based on low-illuminance haze, characterized by comprising:
an acquisition processing module, configured to acquire pixels of an input image and preprocess the input image with a hybrid of side window filtering and fast edge-preserving filtering to obtain an ambient light estimate of the input image;
a setting module, configured to set a light source threshold of the input image according to the ambient light estimate;
a division solving module, configured to divide the input image into a light source region and a non-light source region according to the light source threshold, optimize the transmittance of each region separately with an adaptive light source matrix mechanism, and then fuse the two to solve the initial transmittance;
a compensation module, configured to perform light source compensation on the initial transmittance to obtain and output the final transmittance of the input image;
and a calculation module, configured to perform defogging calculation on the input image with the atmospheric scattering model according to the ambient light estimate and the final transmittance, to obtain a defogged output image.
8. The system for monitoring an unmanned aerial vehicle based on low-illuminance haze according to claim 7, characterized in that the execution units with which the acquisition processing module obtains the ambient light estimate of the input image include:
an acquisition unit, configured to acquire pixels of the input image, treat each pixel as a potential edge with side window filtering, and generate a plurality of side windows around each pixel;
an adjusting unit, configured to process the input image, including adjusting its brightness, and for each pixel to obtain and output the side window whose filtered value has the minimum Euclidean distance to the input image, so as to preserve the edge information of the input image;
a processing unit, configured to obtain a filtered image from the output minimum-distance side windows, and to process the filtered image with fast edge-preserving filtering to obtain a preprocessed image;
and a calculating unit, configured to calculate an ambient light estimate from the preprocessed image by way of the dark channel prior.
9. A computer device, comprising at least one processor and a memory coupled to the processor, the memory storing a program or instructions that, when executed by the processor, implement the steps of the method for monitoring an unmanned aerial vehicle based on low-illuminance haze of any one of claims 1 to 6.
10. A readable storage medium, storing a program or instructions which, when executed by a processor, implement the steps of the method for monitoring an unmanned aerial vehicle based on low-illuminance haze of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202311001014.0A | 2023-08-10 | 2023-08-10 | Method and system for monitoring unmanned aerial vehicle based on low-illuminance haze
Publications (1)
Publication Number | Publication Date
---|---
CN116957984A | 2023-10-27
Family
ID=88461901
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202311001014.0A | Method and system for monitoring unmanned aerial vehicle based on low-illuminance haze | 2023-08-10 | 2023-08-10
Country Status (1)
Country | Link
---|---
CN (1) | CN116957984A (en), pending
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication |
 | SE01 | Entry into force of request for substantive examination |