CN116245772A - Low-illumination unmanned aerial vehicle aerial image enhancement method and device - Google Patents


Publication number
CN116245772A
Authority
CN
China
Prior art keywords
image
low
unmanned aerial
aerial vehicle
illumination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111484809.2A
Other languages
Chinese (zh)
Inventor
王殿伟
邢侦斌
韩鹏飞
房杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian University of Posts and Telecommunications
Original Assignee
Xian University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian University of Posts and Telecommunications
Priority to CN202111484809.2A
Publication of CN116245772A
Legal status: Pending

Classifications

    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/70 Denoising; Smoothing
    • G06T2207/10024 Color image
    • G06T2207/20221 Image fusion; Image merging
    • Y02T10/40 Engine management systems


Abstract

The application provides a low-illumination unmanned aerial vehicle aerial image enhancement method and device, and relates to the field of image processing. The method comprises the following steps: acquiring a low-illumination unmanned aerial vehicle aerial image in RGB format; converting the image into HSV format; preprocessing the brightness component V, and acquiring an overexposed image corresponding to the low-illumination aerial image from the preprocessed brightness component V together with the hue component H and the saturation component S; generating a medium-exposure image from the low-illumination aerial image and the overexposed image; designing a corresponding fusion weight function for each of the low-illumination aerial image, the overexposed image, and the medium-exposure image, and fusing the three images with these weight functions to obtain a fused image; and carrying out detail enhancement on the fused image to obtain the enhanced unmanned aerial vehicle aerial image.

Description

Low-illumination unmanned aerial vehicle aerial image enhancement method and device
Technical Field
The application relates to the technical field of image processing, in particular to a low-illumination unmanned aerial vehicle aerial image enhancement method and device.
Background
Unmanned aerial vehicles offer small size, maneuvering flexibility, good concealment, and easy transport and deployment; they can reach a site quickly to begin work, support beyond-visual-range autonomous flight, and are largely unaffected by harsh environmental conditions or emergencies during operation. They serve a wide range of mission functions and can quickly carry various police equipment according to different task demands. At the present stage they can return acquired data in real time, and future research will further give unmanned aerial vehicles automatic, intelligent perception of the environment, making them a backbone force for striking criminals in investigative activities such as real-time patrol, effective communication, tracking and investigation, weapon strikes, target locking, and cutting off contact.
Under good illumination conditions, the imaging equipment carried by an unmanned aerial vehicle can acquire image data of ideal quality. However, under insufficient illumination (such as at night or in large shadowed areas), the quality of images acquired by the unmanned aerial vehicle's visible-light imaging equipment degrades seriously: detail information and even target information are often lost, the visual quality of the image is severely affected, and subsequent high-level visual tasks such as target recognition and target tracking are made more difficult. Therefore, research on low-illumination unmanned aerial vehicle aerial image enhancement technology, which effectively improves the quality of acquired low-illumination aerial images, highlights image detail, and ensures good image quality of targets under low-illumination conditions, is of great significance for improving the overall effectiveness of unmanned aerial vehicles under low-illumination conditions.
Commonly used low-light image enhancement algorithms can be classified into histogram equalization methods, methods based on the Retinex theory, methods based on deep learning, and image fusion methods.
However, conventional algorithms perform poorly when enhancing unmanned aerial vehicle aerial images. For example, both histogram equalization algorithms and methods based on the Retinex theory use only a single low-illumination image as input; because the effective information contained in a single low-illumination image is inherently limited, these algorithms cannot effectively reveal all details in the image. Methods based on deep learning can achieve a better enhancement effect, but their performance is still limited to a certain extent, mainly because image data sets for training the models are difficult to acquire. Image fusion algorithms are efficient and produce a good enhancement effect; however, they adopt only a single moderately exposed pseudo-exposure image as the complement of the low-illumination image, so the available information they provide is still not comprehensive enough, and underexposed regions of the moderately exposed image still need enhancement. Therefore, more images need to participate in the fusion to further improve the enhancement effect.
Disclosure of Invention
The application provides a low-illumination unmanned aerial vehicle aerial image enhancement method and device, so as to more effectively improve the quality of the low-illumination unmanned aerial vehicle aerial image.
The technical scheme adopted by the application is as follows:
in a first aspect, the present invention provides a method for enhancing aerial images of a low-illuminance unmanned aerial vehicle, including:
acquiring the low-illumination unmanned aerial vehicle aerial image in an RGB format;
converting the RGB format low-illumination unmanned aerial vehicle aerial image into an HSV format low-illumination unmanned aerial vehicle aerial image, wherein each pixel point of the HSV format low-illumination unmanned aerial vehicle aerial image comprises a tone component H, a saturation component S and a brightness component V;
preprocessing the brightness component V, and acquiring an overexposed image corresponding to the low-illumination unmanned aerial vehicle aerial image based on the preprocessed brightness component V together with the original hue component H and saturation component S;
generating a medium-exposure image based on the low-illumination unmanned aerial vehicle aerial image and the overexposed image by using an exposure interpolation algorithm;
respectively designing corresponding fusion weight functions for the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image, and fusing the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image by utilizing the fusion weight functions to obtain fused images, wherein the fusion adopts a multi-scale feature fusion algorithm;
and carrying out detail enhancement on the fused image to obtain an enhanced unmanned aerial vehicle aerial image, wherein the detail enhancement adopts a multi-scale detail enhancement algorithm.
In an implementation manner, the preprocessing is performed on the brightness component V, and the overexposed image corresponding to the low-illumination unmanned aerial vehicle aerial image is obtained based on the preprocessed brightness component V, the hue component H and the saturation component S, which includes:
acquiring the optimal exposure rate of the low-illumination unmanned aerial vehicle aerial image in the HSV format;
based on the optimal exposure rate, performing virtual exposure processing on the brightness component of the low-illumination image by utilizing a brightness mapping function to obtain a preprocessed brightness component V;
and re-synthesizing an HSV-format image from the preprocessed brightness component V and the original hue component H and saturation component S of the low-illumination unmanned aerial vehicle aerial image, and converting the HSV-format image back to an RGB-format image to serve as the overexposed image.
In one implementation, obtaining the optimal exposure rate of the low-light unmanned aerial vehicle aerial image includes:
acquiring a pixel gray value set of underexposure of the low-illumination unmanned aerial vehicle aerial image;
acquiring underexposed image pixel point information entropy according to the pixel gray value set;
and solving for the optimal exposure rate of the low-illumination unmanned aerial vehicle aerial image by using the maximum value of the image pixel information entropy.
In an implementation manner, the generating a medium exposure image based on the low-illumination unmanned aerial vehicle aerial image and the overexposed image includes:
generating a first intermediate virtual image and a second intermediate virtual image corresponding to the low-illumination unmanned aerial vehicle aerial image and the overexposed image through an exposure interpolation algorithm;
and fusing the first intermediate virtual image and the second intermediate virtual image based on a weighted fusion algorithm to generate a medium exposure image.
In an implementation manner, a corresponding fusion weight function is designed for the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposed image respectively, and the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposed image are fused by using the fusion weight function, which includes:
respectively acquiring illumination components of each pixel point in the low-illumination unmanned aerial vehicle aerial image, the overexposure image and the medium exposure image, wherein the illumination components are obtained by firstly acquiring a brightness component V of each pixel point in the low-illumination unmanned aerial vehicle aerial image, the overexposure image and the medium exposure image, and then carrying out smoothing filtering treatment on the brightness component V through a weighted least square filter;
acquiring a brightness weight function of a low-illumination unmanned aerial vehicle aerial image, an overexposure image and a medium-exposure image according to the illumination component of each pixel point;
and based on the brightness weight function, adopting a multi-scale fusion algorithm to fuse the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposed image.
In an implementation manner, the detail enhancement is performed on the fused image to obtain an enhanced unmanned aerial vehicle aerial image, which includes:
acquiring a detail image corresponding to the fused image;
and carrying out fusion processing on the detail image and the fused image to obtain an enhanced unmanned aerial vehicle aerial image.
In one implementation manner, acquiring a detail image corresponding to the fused image includes:
smoothing and filtering the fused image by adopting a multi-scale Gaussian filter to obtain three different Gaussian blur images;
acquiring fine details, middle details and coarse details respectively corresponding to three different Gaussian blur images;
and carrying out weighted fusion on the fine details, the medium details and the coarse details to obtain a detail image.
In a second aspect, the present invention further provides a low-illuminance unmanned aerial vehicle aerial image enhancement device, including:
the acquisition module is used for acquiring the low-illumination unmanned aerial vehicle aerial image in the RGB format;
the conversion module is used for converting the RGB format low-illumination unmanned aerial vehicle aerial image into an HSV format low-illumination unmanned aerial vehicle aerial image, wherein each pixel point of the HSV format low-illumination unmanned aerial vehicle aerial image comprises a tone component H, a saturation component S and a brightness component V;
the overexposed image acquisition unit is used for preprocessing the brightness component V and acquiring an overexposed image corresponding to the low-illumination unmanned aerial vehicle aerial image based on the preprocessed brightness component V together with the original hue component H and saturation component S;
the medium-exposure image acquisition unit is used for generating a medium-exposure image based on the low-illumination unmanned aerial vehicle aerial image and the overexposed image by using an exposure interpolation algorithm;
the fusion image acquisition unit is used for respectively designing corresponding fusion weight functions for the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image, and fusing the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image by utilizing the fusion weight functions to obtain fused images, wherein the fusion adopts a multi-scale feature fusion algorithm;
the enhanced image acquisition unit is used for carrying out detail enhancement on the fused image to obtain an enhanced unmanned aerial vehicle aerial image, and the detail enhancement adopts a multi-scale detail enhancement algorithm.
In a third aspect, the present invention also provides a computer apparatus, comprising:
one or more processors;
a memory for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the low-light unmanned aerial vehicle aerial image enhancement method as described above.
In a fourth aspect, the present invention also provides a computer readable medium having stored thereon a computer program, characterized in that the program when executed by a processor implements a low-illuminance unmanned aerial vehicle aerial image enhancement method as described above.
The technical scheme of the application has the following beneficial effects:
according to the method and the device for enhancing the low-illumination unmanned aerial vehicle aerial image, firstly, the low-illumination unmanned aerial vehicle aerial image in an RGB format is obtained, the low-illumination unmanned aerial vehicle aerial image in the RGB format is converted from the RGB format to an HSV format, the brightness component V is preprocessed, and the overexposure image corresponding to the low-illumination unmanned aerial image is obtained based on the preprocessed brightness component V, hue component H and saturation component S; then, performing exposure interpolation processing based on the low-illumination unmanned aerial vehicle aerial image and the overexposed image to generate a medium exposure image; then, constructing a weight function for the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposed image, and fusing the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposed image to obtain a fused image; and finally, carrying out detail enhancement on the fused image through a multi-scale detail enhancement algorithm to obtain a final enhanced image. The method can effectively improve the quality of the aerial image of the low-illumination unmanned aerial vehicle.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a flow chart of a method for enhancing aerial images of a low-illuminance unmanned aerial vehicle according to the present invention;
FIG. 2a is an original color low-illumination unmanned aerial vehicle aerial image;
FIG. 2b is the enhanced image of the original in FIG. 2a;
FIG. 3a is another original color low-illumination unmanned aerial vehicle aerial image;
FIG. 3b is the enhanced image of the original in FIG. 3a;
FIG. 4a is yet another original color low-illumination unmanned aerial vehicle aerial image;
FIG. 4b is the enhanced image of the original in FIG. 4a.
Detailed Description
Reference will now be made in detail to the embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following embodiments do not represent all implementations consistent with the present application; they are merely examples of systems and methods consistent with some aspects of the application as detailed in the claims.
As noted in the Background, existing algorithms enhance unmanned aerial vehicle aerial images poorly: histogram equalization and Retinex-based methods take only a single low-illumination image as input, whose limited effective information prevents them from revealing all image details; deep-learning-based methods achieve a better enhancement effect but remain constrained by the difficulty of acquiring training data sets; and existing image fusion algorithms, though efficient and effective, use only a single moderately exposed pseudo-exposure image as the complement of the low-illumination image, which is not comprehensive enough, since underexposed regions of the moderately exposed image still need enhancement. More images therefore need to participate in the fusion to improve the enhancement effect.
Accordingly, the application provides a low-illumination unmanned aerial vehicle aerial image enhancement method, a device, computer equipment and a computer readable medium, which are specifically described below.
In a first aspect, as shown in fig. 1, the present application provides a method for enhancing an aerial image of a low-illuminance unmanned aerial vehicle, the method comprising:
s100: the method comprises the steps of obtaining an RGB format low-illumination unmanned aerial vehicle aerial image, and converting the RGB format low-illumination unmanned aerial vehicle aerial image into an HSV format low-illumination unmanned aerial vehicle aerial image, wherein each pixel of the HSV format low-illumination unmanned aerial vehicle aerial image comprises a tone component H, a saturation component S and a brightness component V.
S200: and preprocessing the brightness component V, and acquiring an overexposed image corresponding to the low-illumination unmanned aerial vehicle aerial image based on the preprocessed brightness component V, the preprocessed hue component H and the preprocessed saturation component S.
The preprocessing in step S200 is a brightness boosting process.
In step S200, the optimal exposure rate of the low-illumination image is first obtained according to the principle of maximizing a quality index (the information entropy) of the HSV-format low-illumination unmanned aerial vehicle aerial image, and the processed brightness component V is then obtained by combining this exposure rate with a luminance mapping function. Finally, the HSV image is re-synthesized from the processed brightness component V and the original hue component H and saturation component S and converted back to an RGB image, which serves as the overexposed image. In other words, the brightness of underexposed pixels is raised according to the optimal exposure rate. The step specifically includes:
s201: and acquiring an underexposed pixel gray value set in the HSV format low-illumination unmanned aerial vehicle aerial image. Further, the underexposed pixel gray value set in the low-illumination unmanned aerial vehicle aerial image in the HSV format is:
Q = {V(x) | V(x) < τ}    (1)

where V(x) in equation (1) denotes the brightness component of the low-illumination image E1 at pixel x, and τ is the gray threshold that distinguishes underexposed pixels, taken here as 0.5.
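As an illustrative sketch (not code from the patent), the underexposed-pixel set Q of equation (1) can be extracted as follows, assuming the brightness component V is normalized to [0, 1]:

```python
import numpy as np

def underexposed_set(V, tau=0.5):
    """Equation (1): Q = {V(x) | V(x) < tau}, the brightness values
    of the underexposed pixels of the low-illumination image."""
    V = np.asarray(V, dtype=np.float64)
    return V[V < tau]
```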
S202: and acquiring underexposed image pixel point information entropy according to the pixel gray value set.
Further, the calculation formula of the underexposed image pixel point information entropy is as follows:
H(Q) = -Σ_i p_i · log2(p_i)    (2)

where p_i in equation (2) is the probability of gray level i occurring in Q.
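A minimal numpy sketch of the entropy computation in equation (2), estimating p_i as the relative frequency of each quantized gray level (the 256-level quantization is an assumption, since the patent does not state the bin count):

```python
import numpy as np

def underexposed_entropy(Q, bins=256):
    """Information entropy of the underexposed brightness values Q,
    with p_i estimated as the relative frequency of each gray level."""
    hist, _ = np.histogram(Q, bins=bins, range=(0.0, 1.0))
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]                      # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))
```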
S203: and solving and obtaining the optimal exposure rate of the low-illumination unmanned aerial vehicle aerial photo by utilizing the maximum value of the information entropy of the image pixel points, namely, the optimal exposure rate is the exposure rate when the information entropy takes the maximum value.
The optimal exposure rate solved under the principle of maximizing the image pixel information entropy is:

δ_opt = argmax_δ H(g(Q, δ))    (3)

where g(·) in equation (3) is the luminance mapping function, whose expression is:

g(V, δ) = V^(δ^a) · e^(b·(1 - δ^a))    (4)

where δ in equation (4) is the exposure rate, and a and b are constants, a = -0.3293, b = 1.1528.
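The search for the optimal exposure rate can be sketched as below. The patent gives no closed form for equation (3), so a simple grid search over δ is used here as a stand-in; the power-law/exponential form of g follows the camera-response model implied by the stated constants a = -0.3293, b = 1.1528, and is an assumption to the extent the original equation is illegible:

```python
import numpy as np

A, B = -0.3293, 1.1528  # constants a and b from equation (4)

def g(V, delta):
    """Luminance mapping function: g(V, delta) = V**(delta**A) * exp(B*(1 - delta**A))."""
    gamma = delta ** A
    return np.clip(V ** gamma * np.exp(B * (1.0 - gamma)), 0.0, 1.0)

def optimal_exposure(Q, deltas=np.linspace(1.1, 10.0, 90)):
    """Grid-search stand-in for equation (3): choose the delta that maximizes
    the entropy of the virtually exposed underexposed set g(Q, delta)."""
    def entropy(x):
        hist, _ = np.histogram(x, bins=256, range=(0.0, 1.0))
        p = hist / max(hist.sum(), 1)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))
    return max(deltas, key=lambda d: entropy(g(Q, d)))
```

With δ = 1, g reduces to the identity; δ > 1 brightens dark values, which is the behaviour the preprocessing step relies on.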
S204: and based on the optimal exposure rate, performing virtual exposure processing on the brightness component V of the low-illumination image by utilizing a brightness mapping function to obtain a preprocessed brightness component V.
Based on the obtained optimal exposure rate δ_opt, the brightness component V of the low-illumination image E1 is subjected to virtual exposure processing with the luminance mapping function to obtain the processed V component:

V' = g(V, Δδ)    (5)

where g(·) is the luminance mapping function of equation (4) and Δδ is the exposure rate variation determined by δ_opt.
S205: and re-synthesizing an HSV format image according to the brightness component V, the initial tone component H and the saturation component S after the preprocessing of the low-illumination unmanned aerial vehicle aerial image, and converting the HSV format image back to an RGB format image to serve as an overexposure image.
S300: and generating a medium exposure image based on the low-illumination unmanned aerial vehicle aerial image and the overexposed image, wherein an exposure interpolation algorithm is adopted for the generated medium exposure image.
Further, in step S300, a corresponding first intermediate virtual image and second intermediate virtual image are generated from the low-illumination unmanned aerial vehicle aerial image and the overexposed image by an exposure interpolation algorithm, and the two intermediate virtual images are then fused by a weighted fusion algorithm to generate the medium-exposure image. Generating the medium-exposure image by fusing the low-illumination aerial image and the overexposed image with an exposure interpolation algorithm can, on one hand, better improve the brightness and contrast of the image and, on the other hand, effectively avoid the color distortion in virtual exposure images caused by one-to-many mapping. The step specifically includes:
s301: and acquiring the exposure rates of the low-illumination unmanned aerial vehicle aerial image and the overexposed image.
S302: and acquiring the exposure rate of the medium exposure image according to the exposure rates of the low-illumination unmanned aerial vehicle aerial image and the overexposed image.
Assume that the low-illumination image E1(p) and the overexposed image E3(p) of the same scene have exposure times Δt1 and Δt3 respectively (Δt3 > Δt1). The exposure time Δt2 of the medium-exposure image is then defined as:

Δt2 = (Δt1 · Δt3)^0.5    (6)
s303: a relationship between the medium exposure image and the low-light aerial image and the overexposed image is determined.
Let g32(·) denote the luminance mapping function between the overexposed image E3(p) and the medium-exposure image E2(p), and let g12(·) denote the luminance mapping function between the low-illumination image E1(p) and the medium-exposure image E2(p). g32(·) and g12(·) are defined as:

g12(E1) = F((Δt2/Δt1) · F^(-1)(E1)),  g32(E3) = F((Δt2/Δt3) · F^(-1)(E3))    (7)

where F(·) in equation (7) denotes the camera response function.
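Equations (6) and (7) can be sketched as follows. The patent does not specify the camera response function F, so a simple power-law response F(e) = e^(1/2.2) is assumed purely for illustration:

```python
import numpy as np

GAMMA = 2.2  # assumed power-law camera response (the patent leaves F unspecified)

def F(e):
    """Assumed camera response function F(e) = e**(1/GAMMA)."""
    return np.clip(e, 0.0, None) ** (1.0 / GAMMA)

def F_inv(x):
    """Inverse of the assumed camera response."""
    return np.clip(x, 0.0, None) ** GAMMA

def mid_exposure_time(dt1, dt3):
    """Equation (6): geometric mean of the two exposure times."""
    return (dt1 * dt3) ** 0.5

def interpolate_exposure(E, dt_src, dt_dst):
    """Equation (7): map image E from exposure time dt_src to dt_dst
    through the camera response function."""
    return F((dt_dst / dt_src) * F_inv(E))
```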
S304: and acquiring a first intermediate virtual image according to the relation between the medium exposure image and the low-illumination unmanned aerial vehicle aerial image, and acquiring a second intermediate virtual image according to the relation between the medium exposure image and the overexposed image.
With the low-illumination image and the overexposed image as input data, two intermediate virtual images with the same exposure time are generated using equation (7):

E2^(1)(p) = g12(E1(p)),  E2^(3)(p) = g32(E3(p))    (8)

where E2^(1)(p) is the first intermediate virtual image, obtained from the relationship between the medium-exposure image and the low-illumination unmanned aerial vehicle aerial image, and E2^(3)(p) is the second intermediate virtual image, obtained from the relationship between the medium-exposure image and the overexposed image.
S305: and fusing the first intermediate virtual image and the second intermediate virtual image based on a weighted fusion algorithm to generate a medium exposure image.
The two generated intermediate virtual images are fused by a weighted fusion algorithm to obtain the medium-exposure image:

E2(p) = w1(p) · E2^(1)(p) + w2(p) · E2^(3)(p)    (9)

[Equations (10)-(12), which define the weight functions w1 and w2 in terms of the auxiliary functions h1(z) and h2(z) and the constants ξ_L and ξ_U, appear only as images in the source text and are not reproduced here.]
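The weighted fusion of equation (9) can be sketched as below. Since the patent's exact weight functions w1 and w2 (equations (10)-(12)) are only available as images, the sigmoid weighting used here is an illustrative assumption: trust the virtual image derived from the dark input where it is bright, and the one derived from the overexposed input elsewhere:

```python
import numpy as np

def fuse_intermediates(E2_1, E2_3, xi=0.2):
    """Illustrative stand-in for equation (9): a per-pixel convex
    combination of the two intermediate virtual images.  The sigmoid
    weight (and the constant xi) are assumptions, not the patent's w1/w2."""
    w1 = 1.0 / (1.0 + np.exp(-(E2_1 - 0.5) / xi))  # favour E2^(1) where it is bright
    w2 = 1.0 - w1
    return w1 * E2_1 + w2 * E2_3
```

Because w1 + w2 = 1 at every pixel, the result always lies between the two inputs, which keeps the fused medium-exposure image within a plausible range.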
s400: and respectively designing corresponding fusion weight functions for the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image, and fusing the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image by utilizing the fusion weight functions to obtain fused images, wherein the fusion adopts a multi-scale feature fusion algorithm.
Specifically, step S400 optimally fuses the complementary information of the low-illumination unmanned aerial vehicle aerial image with that of the overexposed image and the medium-exposure image to obtain a more robust visual enhancement effect. This step proposes a new luminance weight function and fuses the low-illumination image, the medium-exposure image, and the overexposed image with a multi-scale fusion algorithm. It specifically includes the following steps:
s401: the method comprises the steps of respectively obtaining illumination components of each pixel point in a low-illumination unmanned aerial vehicle aerial image, an overexposed image and a medium exposed image, wherein the illumination components are obtained by firstly obtaining a brightness component V of each pixel point in the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium exposed image, and then carrying out smoothing filtering treatment on the brightness component V through a weighted least square filter.
In the present embodiment, L1, L2, and L3 denote the illumination components of the low-illumination image E1, the medium-exposure image E2, and the overexposed image E3, respectively. To obtain the illumination components, the images E1, E2, and E3 are converted from the RGB color space to the HSV color space and the brightness component V of each image is acquired; the V component is then smoothed with a weighted least squares (WLS) filter, which preserves the edges of the image, to obtain the illumination component.
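The WLS smoothing above can be sketched as follows. The patent does not give its exact WLS formulation, so this follows the common gradient-weighted formulation, solving (I + λ·L_w)u = v with a sparse Laplacian whose weights are small across strong edges; the `lam` and `alpha` values are illustrative assumptions:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def wls_smooth(V, lam=1.0, alpha=1.2, eps=1e-4):
    """Edge-preserving WLS smoothing of a 2-D brightness component V."""
    h, w = V.shape
    n = h * w
    logV = np.log(V + eps)
    # smoothness weights between vertical / horizontal neighbours,
    # small where the log-brightness gradient is large (i.e. at edges)
    wy = lam / (np.abs(np.diff(logV, axis=0)) ** alpha + eps)  # (h-1, w)
    wx = lam / (np.abs(np.diff(logV, axis=1)) ** alpha + eps)  # (h, w-1)
    idx = np.arange(n).reshape(h, w)
    i_all = np.concatenate([idx[:-1, :].ravel(), idx[:, :-1].ravel()])
    j_all = np.concatenate([idx[1:, :].ravel(), idx[:, 1:].ravel()])
    w_all = np.concatenate([wy.ravel(), wx.ravel()])
    off = sp.coo_matrix((-w_all, (i_all, j_all)), shape=(n, n))
    off = (off + off.T).tocsr()                      # symmetric off-diagonals
    diag = -np.asarray(off.sum(axis=1)).ravel()      # row sums -> Laplacian diagonal
    A = (sp.eye(n) + sp.diags(diag) + off).tocsc()
    return spla.spsolve(A, V.ravel()).reshape(h, w)
```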
S402: and acquiring brightness weight functions of the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposed image according to the illumination component of each pixel point.
For low-illumination image E 1 In other words, the area with poor exposure in the image is effectively enhanced while the area with good exposure in the image is maintained; compared with the low-illumination image E 1 And medium exposure image E 2 Overexposed image E 3 More effective image content information can be displayed while losing image details, and for this reason, the embodiment adopts the Sigmoid function pair E based on illumination component 1 And E is 3 And (5) performing weight setting. From a large amount of statistical data, it is known that the pixel value distribution of the well-exposed image approximately satisfies a gaussian distribution having a mean value of 0.5 and a variance of 0.25, and thus the medium-exposure image E is set using a gaussian distribution function 2 Is a weight of (2). In order to balance the gaussian distribution function and the Sigmoid function, an improved luminance weight function is proposed, which is defined as follows:
[The three luminance weight functions are rendered as images in the original publication: Sigmoid-shaped weights over the illumination components for the low-illumination image E1 and the overexposed image E3, and a Gaussian weight with mean 0.5 and variance 0.25 for the medium-exposure image E2.]
where L1, L2 and L3 denote the illumination components of E1, E2 and E3, respectively.
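Because the exact weight formulas appear only as images in the original, they are not recoverable verbatim; the sketch below assumes one plausible reading of the text: a rising Sigmoid over L1 for E1, the stated Gaussian (mean 0.5, variance 0.25) over L2 for E2, a falling Sigmoid over L3 for E3, and per-pixel normalization. The steepness constant k is an assumption.

```python
import numpy as np

def luminance_weights(l1, l2, l3, k=10.0):
    """Per-pixel weights from the illumination components L1, L2, L3."""
    # E1: Sigmoid rising with illumination, favoring already well-lit pixels.
    w1 = 1.0 / (1.0 + np.exp(-k * (l1 - 0.5)))
    # E2: Gaussian around mid-gray (mean 0.5, variance 0.25), as stated.
    w2 = np.exp(-((l2 - 0.5) ** 2) / (2 * 0.25))
    # E3: falling Sigmoid, down-weighting blown-out overexposed pixels.
    w3 = 1.0 / (1.0 + np.exp(k * (l3 - 0.5)))
    total = w1 + w2 + w3 + 1e-12   # normalize so weights sum to 1 per pixel
    return w1 / total, w2 / total, w3 / total
```

The normalization makes the fused image a per-pixel convex combination of the three exposures, which keeps it in the valid intensity range.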
S403: and based on the brightness weight function, adopting a multi-scale fusion algorithm to fuse the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposed image.
The fusion algorithm in this embodiment can be expressed as:
F_l = Σ_{k=1}^{3} Y_l{Ŵ_k} · L_l{E_k}

where Y_l and L_l denote the l-th layer of the Gaussian pyramid and the l-th layer of the Laplacian pyramid respectively, Ŵ_k = W_k / (W_1 + W_2 + W_3) are the normalized weights, E_1, E_2 and E_3 are the low-illumination image, the medium-exposure image and the overexposed image respectively, and the number of pyramid layers l takes the value 5.
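The fusion step can be sketched as follows, assuming normalized weight maps and a plain NumPy/SciPy pyramid implementation; the patent does not specify the pyramid construction, so the Gaussian kernel width (sigma = 1) and bilinear resampling are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def gaussian_pyramid(img, levels=5, sigma=1.0):
    """Y_l: successively blurred and 2x-downsampled copies of img."""
    pyr = [img]
    for _ in range(levels - 1):
        img = gaussian_filter(img, sigma)[::2, ::2]
        pyr.append(img)
    return pyr

def laplacian_pyramid(img, levels=5):
    """L_l: band-pass residuals between adjacent Gaussian levels."""
    gp = gaussian_pyramid(img, levels)
    lp = []
    for i in range(levels - 1):
        up = zoom(gp[i + 1], (gp[i].shape[0] / gp[i + 1].shape[0],
                              gp[i].shape[1] / gp[i + 1].shape[1]), order=1)
        lp.append(gp[i] - up)
    lp.append(gp[-1])  # coarsest Gaussian level kept as-is
    return lp

def fuse(images, weights, levels=5):
    """F_l = sum_k Y_l{W_k} * L_l{E_k}, then collapsed coarse-to-fine."""
    wps = [gaussian_pyramid(w, levels) for w in weights]
    lps = [laplacian_pyramid(e, levels) for e in images]
    fused_pyr = [sum(wp[l] * lp[l] for wp, lp in zip(wps, lps))
                 for l in range(levels)]
    out = fused_pyr[-1]
    for l in range(levels - 2, -1, -1):
        out = zoom(out, (fused_pyr[l].shape[0] / out.shape[0],
                         fused_pyr[l].shape[1] / out.shape[1]),
                   order=1) + fused_pyr[l]
    return out
```

Blending weights at the Gaussian-pyramid level and image content at the Laplacian-pyramid level is what avoids the seams that per-pixel blending would produce at exposure boundaries.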
S500: and carrying out detail enhancement on the fused image to obtain an enhanced unmanned aerial vehicle aerial image, wherein the detail enhancement adopts a multi-scale detail enhancement algorithm.
Step S500 performs detail enhancement on the fused image through a multi-scale detail enhancement algorithm to obtain the final enhanced image, and specifically comprises the following steps:
S501: smoothing and filtering the fused image with a multi-scale Gaussian filter to obtain three different Gaussian-blurred images.
Smoothing filtering is applied to the fused image with a multi-scale Gaussian filter to obtain three different Gaussian-blurred images, as shown in formula (17):

B1 = G1 * I*, B2 = G2 * I*, B3 = G3 * I*   (17)

where G1, G2 and G3 are Gaussian kernels with standard deviations σ1 = 1.0, σ2 = 2.0 and σ3 = 4.0, and B1, B2 and B3 denote the results of smoothing the fused image I* with the three Gaussian functions of different scales.
S502: and acquiring fine details, middle details and coarse details respectively corresponding to the three different Gaussian blur images.
The fine detail D1, medium detail D2 and coarse detail D3 of the image are extracted as shown in formula (18):

D1 = I* - B1, D2 = B1 - B2, D3 = B2 - B3   (18)
S503: performing weighted fusion on the fine detail, the medium detail and the coarse detail to obtain a detail image.
D1, D2 and D3 are weighted and fused to obtain the detail image D*, as shown in formula (19):

D* = (1 - w1 × sgn(D1)) × D1 + w2 × D2 + w3 × D3   (19)

where w1, w2 and w3 are weight coefficients with values 0.5, 0.5 and 0.25, respectively.
S504: the detail image D* and the fused image I* are fused by addition, as shown in formula (20):

I_final = I* + D*   (20)

where I_final is the finally obtained enhanced unmanned aerial vehicle aerial image.
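Steps S501 to S504, i.e. formulas (17) to (20), can be sketched directly, since the σ values and weight coefficients are given in the text; only the Gaussian kernel truncation (SciPy's default) is an assumption:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def detail_enhance(i_star, w=(0.5, 0.5, 0.25)):
    """Multi-scale detail enhancement of the fused image I* (eqs. 17-20)."""
    # Eq. (17): three Gaussian-blurred versions at sigma = 1.0, 2.0, 4.0.
    b1 = gaussian_filter(i_star, 1.0)
    b2 = gaussian_filter(i_star, 2.0)
    b3 = gaussian_filter(i_star, 4.0)
    # Eq. (18): fine, medium, and coarse detail layers.
    d1 = i_star - b1
    d2 = b1 - b2
    d3 = b2 - b3
    # Eq. (19): sign-aware weighted fusion of the detail layers; the sgn
    # term damps positive fine-detail overshoot relative to negative.
    w1, w2, w3 = w
    d_star = (1 - w1 * np.sign(d1)) * d1 + w2 * d2 + w3 * d3
    # Eq. (20): add the detail image back to the fused image.
    return i_star + d_star
```

On a constant image all detail layers are zero, so the enhancement leaves it untouched; detail appears only where the Gaussian scales disagree.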
According to the low-illumination unmanned aerial vehicle aerial image enhancement method, an RGB-format low-illumination unmanned aerial vehicle aerial image is first acquired and converted from RGB to HSV format; the brightness component V is preprocessed, and an overexposed image corresponding to the low-illumination aerial image is obtained from the preprocessed brightness component V together with the hue component H and saturation component S. A medium-exposure image is then generated by weighted fusion of the low-illumination aerial image and the overexposed image. Next, the low-illumination aerial image, the overexposed image and the medium-exposure image are fused with a multi-scale feature fusion algorithm to obtain a fused image. Finally, detail enhancement is applied to the fused image with a multi-scale detail enhancement algorithm to obtain the final enhanced image. The method can effectively improve the quality of low-illumination unmanned aerial vehicle aerial images.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not limit the implementation process of the embodiments of the present invention.
As a second aspect, the present invention also discloses a low-illuminance unmanned aerial vehicle aerial image enhancement device, including:
the acquisition module is used for acquiring the low-illumination unmanned aerial vehicle aerial image in the RGB format;
the conversion module is used for converting the RGB format low-illumination unmanned aerial vehicle aerial image into an HSV format low-illumination unmanned aerial vehicle aerial image, wherein each pixel point of the HSV format low-illumination unmanned aerial vehicle aerial image comprises a tone component H, a saturation component S and a brightness component V;
the overexposure image acquisition unit is used for preprocessing the brightness component V and acquiring an overexposure image corresponding to the low-illumination unmanned aerial vehicle aerial image based on the preprocessed brightness component V, the preprocessed tone component H and the preprocessed saturation component S;
the medium exposure image acquisition unit is used for generating a medium exposure image based on the low-illumination unmanned aerial vehicle aerial image and the overexposed image, and the generated medium exposure image adopts an exposure interpolation algorithm;
the fusion image acquisition unit is used for respectively designing corresponding fusion weight functions for the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image, and fusing the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image by utilizing the fusion weight functions to obtain fused images, wherein the fusion adopts a multi-scale feature fusion algorithm;
the enhanced image acquisition unit is used for carrying out detail enhancement on the fused image to obtain an enhanced unmanned aerial vehicle aerial image, and the detail enhancement adopts a multi-scale detail enhancement algorithm.
It should be noted that: when the aerial image enhancement of the low-illumination unmanned aerial vehicle is performed, only the division of the functional modules is used for illustration, and in practical application, the functional distribution can be completed by different functional modules according to the needs, namely, the internal structure of the device is divided into different functional modules so as to complete all or part of the functions described above. In addition, the low-illumination unmanned aerial vehicle aerial image enhancement device and the low-illumination unmanned aerial vehicle aerial image enhancement method provided in the above embodiments belong to the same conception, and detailed implementation processes thereof are shown in the method embodiments and are not repeated here.
For limitations of the low-illumination unmanned aerial vehicle aerial image enhancement device, reference may be made to the limitations of the low-illumination unmanned aerial vehicle aerial image enhancement method above, which are not repeated here. In addition, each module in the low-illumination unmanned aerial vehicle aerial image enhancement device may be fully or partially realized by software, hardware, or a combination thereof. The above modules may be embedded in hardware, may be independent of the processor in the computer device, or may be stored as software in a memory of the computer device, so that the processor may call and execute the operations corresponding to the above modules.
As a third aspect, the present invention also discloses a computer device, which may be a server. The computer device includes: one or more processors for providing computing and control capabilities; a memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to perform the steps of: acquiring the low-illumination unmanned aerial vehicle aerial image in an RGB format; converting the RGB format low-illumination unmanned aerial vehicle aerial image into an HSV format low-illumination unmanned aerial vehicle aerial image, wherein each pixel point of the HSV format low-illumination unmanned aerial vehicle aerial image comprises a tone component H, a saturation component S and a brightness component V; preprocessing a brightness component V, and acquiring an overexposed image corresponding to the low-illumination unmanned aerial vehicle aerial image based on the preprocessed brightness component V, hue component H and saturation component S; generating a medium exposure image based on the low-illumination unmanned aerial vehicle aerial image and the overexposed image, wherein the generated medium exposure image adopts an exposure interpolation algorithm; respectively designing corresponding fusion weight functions for the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image, and fusing the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image by utilizing the fusion weight functions to obtain fused images, wherein the fusion adopts a multi-scale feature fusion algorithm; and carrying out detail enhancement on the fused image to obtain an enhanced unmanned aerial vehicle aerial image, wherein the detail enhancement adopts a multi-scale detail enhancement algorithm.
As a fourth aspect, the present invention also discloses a computer-readable medium having a computer program stored thereon, which may be included in the apparatus described in the above embodiment or may exist alone without being assembled into the apparatus. The above program is executed by the processor to: acquiring the low-illumination unmanned aerial vehicle aerial image in an RGB format; converting the RGB format low-illumination unmanned aerial vehicle aerial image into an HSV format low-illumination unmanned aerial vehicle aerial image, wherein each pixel point of the HSV format low-illumination unmanned aerial vehicle aerial image comprises a tone component H, a saturation component S and a brightness component V; preprocessing a brightness component V, and acquiring an overexposed image corresponding to the low-illumination unmanned aerial vehicle aerial image based on the preprocessed brightness component V, hue component H and saturation component S; generating a medium exposure image based on the low-illumination unmanned aerial vehicle aerial image and the overexposed image, wherein the generated medium exposure image adopts an exposure interpolation algorithm; respectively designing corresponding fusion weight functions for the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image, and fusing the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image by utilizing the fusion weight functions to obtain fused images, wherein the fusion adopts a multi-scale feature fusion algorithm; and carrying out detail enhancement on the fused image to obtain an enhanced unmanned aerial vehicle aerial image, wherein the detail enhancement adopts a multi-scale detail enhancement algorithm.
It is noted that relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely a specific embodiment of the application to enable one skilled in the art to understand or practice the application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
It will be understood that the present application is not limited to what has been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. The method for enhancing the aerial image of the low-illumination unmanned aerial vehicle is characterized by comprising the following steps of:
acquiring the low-illumination unmanned aerial vehicle aerial image in an RGB format;
converting the RGB format low-illumination unmanned aerial vehicle aerial image into an HSV format low-illumination unmanned aerial vehicle aerial image, wherein each pixel point of the HSV format low-illumination unmanned aerial vehicle aerial image comprises a tone component H, a saturation component S and a brightness component V;
preprocessing a brightness component V, and acquiring an overexposed image corresponding to the low-illumination unmanned aerial vehicle aerial image based on the preprocessed brightness component V, hue component H and saturation component S;
generating a medium exposure image based on the low-illumination unmanned aerial vehicle aerial image and the overexposed image, wherein the generated medium exposure image adopts an exposure interpolation algorithm;
respectively designing corresponding fusion weight functions for the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image, and fusing the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image by utilizing the fusion weight functions to obtain fused images, wherein the fusion adopts a multi-scale feature fusion algorithm;
and carrying out detail enhancement on the fused image to obtain an enhanced unmanned aerial vehicle aerial image, wherein the detail enhancement adopts a multi-scale detail enhancement algorithm.
2. The method for enhancing a low-illuminance unmanned aerial vehicle aerial image according to claim 1, wherein preprocessing the luminance component V, and acquiring an overexposed image corresponding to the low-illuminance unmanned aerial image based on the preprocessed luminance component V and the hue component H and the saturation component S, comprises:
acquiring the optimal exposure rate of the low-illumination unmanned aerial vehicle aerial image in the HSV format;
based on the optimal exposure rate, performing virtual exposure processing on the brightness component of the low-illumination image by utilizing a brightness mapping function to obtain a preprocessed brightness component V;
and re-synthesizing an HSV format image according to the brightness component V, the initial tone component H and the saturation component S after the preprocessing of the low-illumination unmanned aerial vehicle aerial image, and converting the HSV format image back to an RGB format image to serve as an overexposure image.
3. The method of claim 2, wherein obtaining the optimal exposure rate of the low-illumination unmanned aerial vehicle aerial image comprises:
acquiring a pixel gray value set of underexposure of the low-illumination unmanned aerial vehicle aerial image;
acquiring underexposed image pixel point information entropy according to the pixel gray value set;
and solving for the optimal exposure rate of the low-illumination unmanned aerial vehicle aerial image by using the maximum value of the information entropy of the image pixel points.
4. The method of claim 1, wherein generating a medium exposure image based on fusion of the low-illuminance unmanned aerial vehicle aerial image and the overexposed image comprises:
generating a first intermediate virtual image and a second intermediate virtual image corresponding to the low-illumination unmanned aerial vehicle aerial image and the overexposed image through an exposure interpolation algorithm;
and fusing the first intermediate virtual image and the second intermediate virtual image based on a weighted fusion algorithm to generate a medium exposure image.
5. The method of claim 1, wherein designing corresponding fusion weight functions for the low-illuminance unmanned aerial vehicle aerial image, the overexposed image, and the medium-exposure image, respectively, and fusing the low-illuminance unmanned aerial vehicle aerial image, the overexposed image, and the medium-exposure image using the fusion weight functions, comprises:
respectively acquiring illumination components of each pixel point in the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposed image, wherein the illumination components are obtained by firstly acquiring a brightness component V of each pixel point in the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposed image, and then carrying out smoothing filtering treatment on the brightness component V through a weighted least square filter;
acquiring brightness weight functions of a low-illumination unmanned aerial vehicle aerial image, an overexposure image and a medium-exposure image according to the illumination component of each pixel point;
and based on the brightness weight function, adopting a multi-scale fusion algorithm to fuse the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposed image.
6. The method for enhancing the aerial image of the low-illumination unmanned aerial vehicle according to claim 1, wherein the detail enhancement is performed on the fused image to obtain the enhanced aerial image of the unmanned aerial vehicle, comprising:
acquiring a detail image corresponding to the fused image;
and fusing the detail image and the fused image together to obtain the enhanced unmanned aerial vehicle aerial image.
7. The method for enhancing an aerial image of a low-illuminance unmanned aerial vehicle according to claim 6, wherein acquiring a detail image corresponding to the fused image includes:
smoothing and filtering the fused image by adopting a multi-scale Gaussian filter to obtain three different Gaussian blur images;
acquiring fine details, middle details and coarse details respectively corresponding to three different Gaussian blur images;
and carrying out weighted fusion on the fine detail, the medium detail and the coarse detail to obtain a detail image.
8. A low-light unmanned aerial vehicle aerial image enhancement device, comprising:
the acquisition module is used for acquiring the low-illumination unmanned aerial vehicle aerial image in the RGB format;
the conversion module is used for converting the RGB format low-illumination unmanned aerial vehicle aerial image into an HSV format low-illumination unmanned aerial vehicle aerial image, wherein each pixel point of the HSV format low-illumination unmanned aerial vehicle aerial image comprises a tone component H, a saturation component S and a brightness component V;
the overexposure image acquisition unit is used for preprocessing the brightness component V and acquiring an overexposure image corresponding to the low-illumination unmanned aerial vehicle aerial image based on the preprocessed brightness component V, the preprocessed tone component H and the preprocessed saturation component S;
the medium exposure image acquisition unit is used for generating a medium exposure image based on the low-illumination unmanned aerial vehicle aerial image and the overexposed image, and the generated medium exposure image adopts an exposure interpolation algorithm;
the fusion image acquisition unit is used for respectively designing corresponding fusion weight functions for the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image, and fusing the low-illumination unmanned aerial vehicle aerial image, the overexposed image and the medium-exposure image by utilizing the fusion weight functions to obtain fused images, wherein the fusion adopts a multi-scale feature fusion algorithm;
the enhanced image acquisition unit is used for carrying out detail enhancement on the fused image to obtain an enhanced unmanned aerial vehicle aerial image, and the detail enhancement adopts a multi-scale detail enhancement algorithm.
9. A computer device, comprising:
one or more processors;
a memory for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the low-light unmanned aerial vehicle aerial image enhancement method of any of claims 1 to 7.
10. A computer readable medium having stored thereon a computer program, wherein the program when executed by a processor implements the low-illuminance unmanned aerial vehicle aerial image enhancement method of any of claims 1 to 7.
CN202111484809.2A 2021-12-07 2021-12-07 Low-illumination unmanned aerial vehicle aerial image enhancement method and device Pending CN116245772A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111484809.2A CN116245772A (en) 2021-12-07 2021-12-07 Low-illumination unmanned aerial vehicle aerial image enhancement method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111484809.2A CN116245772A (en) 2021-12-07 2021-12-07 Low-illumination unmanned aerial vehicle aerial image enhancement method and device

Publications (1)

Publication Number Publication Date
CN116245772A true CN116245772A (en) 2023-06-09

Family

ID=86631781

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111484809.2A Pending CN116245772A (en) 2021-12-07 2021-12-07 Low-illumination unmanned aerial vehicle aerial image enhancement method and device

Country Status (1)

Country Link
CN (1) CN116245772A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117451012A (en) * 2023-12-25 2024-01-26 威海市三维工程测绘有限公司 Unmanned aerial vehicle aerial photography measurement method and system
CN117451012B (en) * 2023-12-25 2024-03-05 威海市三维工程测绘有限公司 Unmanned aerial vehicle aerial photography measurement method and system
CN118293885A (en) * 2024-06-04 2024-07-05 新坐标科技有限公司 Unmanned aerial vehicle photovoltaic power station terrain three-dimensional mapping method and control system

Similar Documents

Publication Publication Date Title
US11882357B2 (en) Image display method and device
CN111402146B (en) Image processing method and image processing apparatus
CN112581379A (en) Image enhancement method and device
CN112541877B (en) Defuzzification method, system, equipment and medium for generating countermeasure network based on condition
CN116245772A (en) Low-illumination unmanned aerial vehicle aerial image enhancement method and device
CN112602088B (en) Method, system and computer readable medium for improving quality of low light images
CN115861380B (en) Method and device for tracking visual target of end-to-end unmanned aerial vehicle under foggy low-illumination scene
CN112272832A (en) Method and system for DNN-based imaging
Zheng et al. Low-light image and video enhancement: A comprehensive survey and beyond
CN110717864B (en) Image enhancement method, device, terminal equipment and computer readable medium
Wu et al. Reflectance-guided histogram equalization and comparametric approximation
Chen et al. Retinex low-light image enhancement network based on attention mechanism
CN114926367A (en) Method for generating hologram of coal mine degraded image
CN115035011A (en) Low-illumination image enhancement method for self-adaptive RetinexNet under fusion strategy
Guo et al. Reinforced depth-aware deep learning for single image dehazing
Pan et al. ChebyLighter: Optimal Curve Estimation for Low-light Image Enhancement
Desai et al. Lightnet: Generative model for enhancement of low-light images
Li et al. LDNet: low-light image enhancement with joint lighting and denoising
CN115063301A (en) Video denoising method, video processing method and device
EP4383183A1 (en) Data processing method and apparatus
CN114926348B (en) Device and method for removing low-illumination video noise
Park et al. Enhancing underwater color images via optical imaging model and non-local means denoising
WO2022193132A1 (en) Image detection method and apparatus, and electronic device
Nair et al. Benchmarking single image dehazing methods
Cyganek et al. Virtual high dynamic range imaging for underwater drone navigation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination