CN116258644A - Image enhancement method, device, computer equipment and storage medium - Google Patents

Image enhancement method, device, computer equipment and storage medium

Info

Publication number
CN116258644A
CN116258644A (application number CN202310065902.2A)
Authority
CN
China
Prior art keywords
brightness
map
image
value
initial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310065902.2A
Other languages
Chinese (zh)
Inventor
徐一丁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Glenfly Tech Co Ltd
Original Assignee
Glenfly Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Glenfly Tech Co Ltd filed Critical Glenfly Tech Co Ltd
Priority to CN202310065902.2A priority Critical patent/CN116258644A/en
Publication of CN116258644A publication Critical patent/CN116258644A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present application relates to an image enhancement method, apparatus, computer device, storage medium and computer program product. The method comprises the following steps: acquiring a color image to be processed; performing brightness extraction on the color image to be processed to obtain an initial brightness map; calculating based on the initial brightness map to obtain a brightness enhancement value; and obtaining a brightness enhancement image according to the brightness enhancement value and the color image to be processed. The method can improve image definition.

Description

Image enhancement method, device, computer equipment and storage medium
Technical Field
The present application relates to the field of image technology, and in particular, to an image enhancement method, an image enhancement apparatus, a computer device, a storage medium, and a computer program product.
Background
With the popularity of high definition display devices, the resolution of user display devices has generally increased to levels of 2K and higher. Most video resolutions today are still 1080p, requiring up-sampling of the video picture to the resolution of the display. The quality of the image upsampling directly affects the effect of the final display of the video.
Upsampling algorithms in the related art can be based on interpolation, reconstruction, or learning. Interpolation has the lowest cost and the widest applicability, but interpolation-based upsampling yields lower image definition.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an image enhancement method, apparatus, computer device, computer-readable storage medium, and computer program product that can improve image sharpness.
In a first aspect, the present application provides an image enhancement method. The method comprises the following steps:
acquiring a color image to be processed;
performing brightness extraction on the color image to be processed to obtain an initial brightness map;
calculating based on the initial brightness map to obtain a brightness enhancement value;
and obtaining a brightness enhancement image according to the brightness enhancement value and the color image to be processed.
In one embodiment, the performing brightness extraction on the color image to be processed to obtain an initial brightness map includes:
acquiring a pixel extraction coefficient;
and obtaining the initial brightness map according to the pixel extraction coefficient and the color image to be processed.
In one embodiment, the calculating based on the initial luminance map to obtain the luminance enhancement value includes:
filtering based on the initial brightness map and the filter coefficient to obtain a target brightness map;
and obtaining the brightness enhancement value according to the target brightness map and the initial brightness map.
In one embodiment, the obtaining the brightness enhancement value according to the target brightness map and the initial brightness map includes:
performing difference making on the target brightness map and the initial brightness map to obtain a difference map;
and obtaining the brightness enhancement value according to the difference value diagram and the sharpening coefficient.
In one embodiment, the filtering based on the initial luminance map and the filter coefficient to obtain a target luminance map includes:
according to the filter coefficient, filtering each pixel in the initial brightness map in different directions to obtain a filter pixel value corresponding to each pixel;
and combining the brightness value of each pixel and the filtered pixel value to obtain the target brightness map.
In one embodiment, the calculating method of the filter coefficient includes:
transforming according to the initial brightness map to obtain a guide output map;
obtaining an objective function according to the guide output graph and the initial brightness value;
and solving the objective function to obtain the filter coefficient.
In one embodiment, the obtaining the brightness enhancement image according to the brightness enhancement value and the color image to be processed includes:
amplifying the color image to be processed to obtain an amplified image;
and adding each pixel in the amplified image and the brightness enhancement value to obtain the brightness enhancement image.
In a second aspect, the present application also provides an image enhancement apparatus. The device comprises:
the acquisition module is used for acquiring the color image to be processed;
the extraction module is used for extracting the brightness of the color image to be processed to obtain an initial brightness image;
the calculation module is used for calculating based on the initial brightness map to obtain a brightness enhancement value;
and the enhancement module is used for obtaining a brightness enhancement image according to the brightness enhancement value and the color image to be processed.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the steps of the method of any of the embodiments described above when the processor executes the computer program.
In a fourth aspect, the present application also provides a computer-readable storage medium. The computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of any of the embodiments described above.
In a fifth aspect, the present application also provides a computer program product. The computer program product comprising a computer program which, when executed by a processor, implements the steps of the method of any of the embodiments described above.
With the image enhancement method, apparatus, computer device, storage medium and computer program product, the server first performs brightness extraction on the color image to be processed to obtain a single-channel initial brightness map, which reduces the amount of computation and avoids color cast in the subsequent texture extraction; second, the calculated brightness enhancement value is added to all three RGB channels, so the image retains high definition.
Drawings
FIG. 1 is a diagram of an application environment for an image enhancement method in one embodiment;
FIG. 2 is a schematic diagram of a filter in different directions in one embodiment;
FIG. 3 (a) is a brightness enhancement image with the smallest absolute difference;
FIG. 3 (b) is a brightness enhancement image with the largest absolute difference;
FIG. 4 (a) is a schematic diagram of a brightness enhancement image in one embodiment;
FIG. 4 (b) is a schematic diagram of a brightness enhancement image according to another embodiment;
FIG. 5 is a block diagram of an image enhancement device in one embodiment;
FIG. 6 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, there is provided an image enhancement method including the steps of:
s102, acquiring a color image to be processed.
The color image is an image having various colors, such as an RGB image obtained by superimposing the red, green, and blue color channels.
S104, carrying out brightness extraction on the color image to be processed to obtain an initial brightness map.
Alternatively, brightness extraction can be performed on three channels of the color image to be processed, so as to obtain a brightness value corresponding to each pixel point, and further obtain an initial brightness map. Wherein the initial luminance map refers to an image obtained after luminance extraction of a color image to be processed, which is used to characterize the brightness condition of the image.
Alternatively, the three channels may each be extracted with a certain pixel extraction coefficient; for example, 20% of each of the three channels may be taken and summed to obtain the corresponding luminance value. Illustratively, assuming the current pixel is (250, 251, 252), the luminance value is 0.2×250 + 0.2×251 + 0.2×252 = 150.6. In other embodiments, the extraction coefficient may differ for each channel of each pixel of the color image to be processed.
Alternatively, luminance extraction may be performed by a conversion function built in the server, such as the rgb2gray function.
And S106, calculating based on the initial brightness map to obtain a brightness enhancement value.
Optionally, the server may calculate the initial luminance map to obtain a target luminance map, and then obtain a luminance enhancement value corresponding to each pixel according to a difference between the target luminance map and the initial luminance map. The target luminance map refers to an image which retains the required texture content after performing operations such as linear transformation and filtering on the initial luminance map.
Optionally, the server may filter the initial luminance map through a different filter, so as to obtain a target luminance map retaining the texture content. For example, the initial luminance map may be filtered by a side window filter, or the initial luminance map may be filtered by a side window filter and a guide filter to obtain the target luminance map.
Alternatively, the server may subtract the target luminance map from the initial luminance map to obtain the luminance enhancement value. For example, if the initial luminance value of a certain pixel in the initial luminance map is a, and the target luminance value of that pixel in the target luminance map is b, the luminance enhancement value of the pixel is a-b.
S108, obtaining a brightness enhancement image according to the brightness enhancement value and the color image to be processed.
Specifically, after obtaining the brightness enhancement value, the server adds the brightness enhancement value to each of the three channels of every pixel, thereby obtaining the brightness enhancement image. For example, assuming opY is a luminance enhancement value, then
R += opY; G += opY; B += opY    formula (1)
The luminance enhanced image can be obtained by the formula (1).
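For illustration only, formula (1) can be sketched in Python/NumPy as follows; the H×W×3 RGB layout, the per-pixel single-channel opY map, the function name and the clipping to the 8-bit range are assumptions not specified in the original text.

```python
import numpy as np

def apply_brightness_enhancement(rgb, op_y):
    """Formula (1): add the single-channel enhancement value opY to R, G and B of every pixel."""
    out = rgb.astype(np.float64) + op_y[..., np.newaxis]  # broadcast opY across the 3 channels
    return np.clip(out, 0, 255).astype(np.uint8)          # keep values in the valid 8-bit range
```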
In the image enhancement method, the server first performs brightness extraction on the color image to be processed to obtain a single-channel initial brightness map, which reduces the amount of computation and avoids color cast in the subsequent texture extraction; second, the calculated brightness enhancement value is added to all three RGB channels, so the image retains high definition.
In one embodiment, performing luminance extraction on a color image to be processed to obtain an initial luminance map includes: acquiring a pixel extraction coefficient; and obtaining an initial brightness map according to the pixel extraction coefficient and the color image to be processed.
The pixel extraction coefficient is a preset coefficient used to extract the brightness of a pixel; it may be a single value or a set. When the extraction coefficients for all channels are the same, the pixel extraction coefficient may be a single value, for example 0.2126; when the coefficients differ between channels, it may be a set, e.g., {a, b, c}, where a, b, and c are the extraction coefficients for the R, G, and B channels, respectively.
Optionally, the server acquires pixel extraction coefficients, and performs brightness extraction on each pixel point in the color image to be processed according to the pixel extraction coefficients to obtain brightness values corresponding to each pixel point, thereby obtaining an initial brightness map.
Alternatively, the server may obtain the corresponding pixel extraction coefficients according to the color gamut type of the display device and the video.
Illustratively, when the color gamut type of the display device and the video is ITU BT.709, the pixel extraction coefficient is {0.2126, 0.7152, 0.0722}, and the initial luminance value Y may be calculated according to formula (2), as follows:
Y = 0.2126×R + 0.7152×G + 0.0722×B    formula (2)
For example, when the color gamut type of the display device and the video is ITU BT.601, the pixel extraction coefficient is {0.299, 0.587, 0.114}; the coefficients corresponding to R, G and B in formula (2) are replaced with 0.299, 0.587 and 0.114, respectively, to calculate the initial luminance value Y for ITU BT.601.
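A minimal Python/NumPy sketch of the luminance extraction of formula (2) is given below; the function name, the H×W×3 floating-point layout and the coefficient table are illustrative assumptions covering the two gamuts mentioned above.

```python
import numpy as np

# Assumed pixel extraction coefficients per color gamut, in (R, G, B) order.
LUMA_COEFFS = {
    "bt709": (0.2126, 0.7152, 0.0722),
    "bt601": (0.299, 0.587, 0.114),
}

def extract_luminance(rgb, gamut="bt709"):
    """Initial luminance map Y of an H x W x 3 RGB image, formula (2)."""
    r, g, b = LUMA_COEFFS[gamut]
    rgb = rgb.astype(np.float64)
    return r * rgb[..., 0] + g * rgb[..., 1] + b * rgb[..., 2]
```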
In the above embodiment, the luminance value corresponding to the color image to be processed can be accurately calculated through the pixel extraction coefficient, so that the initial luminance map of the color image to be processed can be accurately represented.
In one embodiment, the computing based on the initial luminance map to obtain the luminance enhancement value includes: filtering based on the initial brightness map and the filter coefficient to obtain a target brightness map; and obtaining a brightness enhancement value according to the target brightness map and the initial brightness map.
The filter coefficients are pre-calculated values used in the guided filtering to perform a linear transformation; their specific calculation is described in the embodiment on calculating the filter coefficients below.
Alternatively, the server may first perform linear transformation on each pixel point in the initial luminance map according to the filter coefficient, and then perform filtering through a filter. For example, a side window filter, an average filter, or the like is used for filtering to obtain a target luminance map.
Optionally, obtaining the brightness enhancement value according to the target brightness map and the initial brightness map includes: performing difference between the target brightness map and the initial brightness map to obtain a difference map; and obtaining a brightness enhancement value according to the difference value diagram and the sharpening coefficient.
For example, the brightness enhancement value corresponding to each pixel may be calculated according to the formula (3), and the specific calculation process is as follows:
opY = (I - q)·λ    formula (3)
Where I is the guide map, which is the same as the initial brightness map; q is the target luminance map; I - q is the difference map; and λ is the sharpening coefficient, which controls the degree of sharpening globally.
In the above embodiment, the brightness enhancement value corresponding to each pixel can be calculated by the formula (3), and the intensity of global sharpening can be controlled by the sharpening coefficient.
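A one-function sketch of formula (3), assuming the initial and target luminance maps are NumPy arrays of equal shape and lam stands for the sharpening coefficient λ (the names are illustrative):

```python
def brightness_enhancement_value(initial_luma, target_luma, lam):
    """Formula (3): opY = (I - q) * lambda, the difference map scaled by the sharpening coefficient."""
    return (initial_luma - target_luma) * lam
```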
In one embodiment, filtering based on the initial luminance map and the filter coefficients to obtain the target luminance map includes: filtering each pixel in the initial brightness map in different directions according to the filter coefficient to obtain a filter pixel value corresponding to each pixel; and combining the brightness value of each pixel and the filtered pixel value to obtain a target brightness map.
Optionally, the server may construct filters in different directions, and then use the filters in different directions to perform filtering in different directions on each pixel in the initial luminance map according to the filter coefficients, so as to obtain filtered pixel values of each pixel corresponding to the filters in different directions. The filtered pixel value refers to a pixel value obtained by filtering each pixel.
Alternatively, the server may use filters without jagged shapes, which makes the traversal computation more convenient. For example, referring to fig. 2, a schematic diagram of filters in different directions in one embodiment, the filters in fig. 2 are square or rectangular, and filters in eight directions are shown: left, right, up, down, north-west, north-east, south-east and south-west.
Optionally, after the initial brightness value is filtered by using the filter in each direction, a filtered pixel value of the corresponding pixel is obtained, so that the filtered pixel value can be screened by combining the current brightness value of each pixel to obtain the target brightness map.
Optionally, the absolute difference between each filtered pixel value of the current pixel and the current brightness value of that pixel may be computed, and the target brightness value is selected according to the absolute difference, thereby obtaining the target brightness map.
Optionally, the server may select, as the target luminance value, the filtered pixel value whose absolute difference from the current pixel's luminance is the largest or the smallest, according to how much edge contour information should be retained in the brightness enhancement image. For example, if more edge contours need to be preserved in the brightness enhancement image, the filtered pixel value with the largest absolute difference is selected as the target brightness value; otherwise, the filtered pixel value with the smallest absolute difference is selected. The comparison can be seen in fig. 3 (a) and fig. 3 (b): fig. 3 (a) selects the filtered pixel value with the smallest absolute difference as the target luminance value, and fig. 3 (b) selects the filtered pixel value with the largest absolute difference; more edge information is retained in fig. 3 (b) than in fig. 3 (a).
For example, the server may obtain the target luminance map according to the following pseudo code, and a side window filter with a filter kernel size of 3×3 is used in this embodiment.
[Pseudocode figure: side window filtering of the initial luminance map with a 3×3 kernel; the procedure is described below.]
The server processes each pixel point in the brightness map by traversing its rows and columns and applies the a and b transformation to each pixel, where a and b are the filter coefficients. Filtering with the 8 filters then yields the filtered output values conv_res[i]; each is differenced with the brightness value of the current pixel to obtain the absolute difference abs_diff[i]; finally, the filtered output value whose difference from the brightness value at the current pixel position is smallest (or largest) is selected as the target brightness value, giving the output image q, i.e. the target brightness map.
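Because the original pseudocode is only available as an embedded figure, the following Python/NumPy sketch reconstructs the described procedure under stated assumptions: eight 3×3 half-window mean kernels, the per-pixel a·Y + b transform before filtering, and selection of the filtered response whose absolute difference from the current luminance is smallest (or largest). The scipy convolution, the kernel shapes and all names are assumptions, not the patent's exact pseudocode.

```python
import numpy as np
from scipy.ndimage import convolve  # assumed stand-in for the 2-D filtering step

def side_window_kernels():
    """Eight 3x3 half-window mean kernels: left, right, up, down, NW, NE, SW, SE."""
    col = np.array([[1., 1., 0.]] * 3)          # columns -1 and 0 of the window
    row = np.array([[1., 1., 1.],
                    [1., 1., 1.],
                    [0., 0., 0.]])              # rows -1 and 0 of the window
    masks = [col, col[:, ::-1],                                  # left, right
             row, row[::-1, :],                                  # up, down
             col * row, col[:, ::-1] * row,                      # north-west, north-east
             col * row[::-1, :], col[:, ::-1] * row[::-1, :]]    # south-west, south-east
    return [m / m.sum() for m in masks]         # normalize each half window to a mean filter

def side_window_filter(luma, a, b, keep_max=False):
    """Target luminance map q: filter a*Y + b with the eight side windows and keep, per pixel,
    the response whose absolute difference from Y is smallest (or largest when keep_max)."""
    transformed = a * luma + b                                   # a/b may be scalars or per-pixel maps
    responses = np.stack([convolve(transformed, k, mode="nearest")
                          for k in side_window_kernels()])       # conv_res[i]
    abs_diff = np.abs(responses - luma)                          # abs_diff[i]
    idx = abs_diff.argmax(axis=0) if keep_max else abs_diff.argmin(axis=0)
    return np.take_along_axis(responses, idx[np.newaxis], axis=0)[0]
```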
In the above embodiment, by configuring the side window filter, the server obtains the filtered pixel value whose difference from the original texture is largest or smallest, and the selected filtered pixel value controls how much texture is preserved at edge contours.
In one embodiment, the calculating method of the filtering coefficient includes: transforming the initial brightness map to obtain a guide output map; obtaining an objective function according to the guide output graph and the initial brightness value; and solving the objective function to obtain a filter coefficient.
Optionally, the server performs linear transformation on the pixel points in the initial luminance map to obtain an output image Q after linear transformation, that is, a guide output map. Illustratively, the guide map may be linearly transformed by equation (4).
Q_i = a_k·I_i + b_k, for every pixel i in the window w_k    formula (4)
where I is the guide map, i.e. the initial luminance map; for any position k in the image, the filter window is w_k, and a_k and b_k are the linear coefficients within window w_k.
In order to make the output image Q and the input image p locally consistent, the difference between Q and p can be minimized. Since the input image in this embodiment is the initial luminance map, the server can construct an objective function from the initial luminance map and the guide output map, and obtain the filter coefficients by solving the objective function.
For example, the server may take the mean square error between the initial luminance map and the guide output map as the objective function to be minimized, as shown in formula (5).
E(a_k, b_k) = Σ_{i∈w_k} ((a_k·I_i + b_k - p_i)² + ε·a_k²)    formula (5)
where p is the input image (here the initial luminance map) and ε is a regularization coefficient that penalizes large values of a_k.
Then, taking the partial derivatives with respect to a_k and b_k gives formula (6) and formula (7).
∂E/∂a_k = Σ_{i∈w_k} (2·I_i·(a_k·I_i + b_k - p_i) + 2·ε·a_k)    formula (6)
∂E/∂b_k = Σ_{i∈w_k} 2·(a_k·I_i + b_k - p_i)    formula (7)
Setting the partial derivatives with respect to a_k and b_k equal to 0, a_k and b_k can be solved; a and b can then be calculated from the guide map I and the input image p, as shown in formulas (8) to (11).
Var_I = f_mean(I·I) - f_mean(I)·f_mean(I)    formula (8)
Cov_Ip = f_mean(I·p) - f_mean(I)·f_mean(p)    formula (9)
a = Cov_Ip / (Var_I + ε)    formula (10)
b = f_mean(p) - a·f_mean(I)    formula (11)
where f_mean denotes mean filtering. Since the guide map I and the input image p are the same (both are the initial luminance map Y), the calculation of a and b can be simplified to:
a = Var_Y / (Var_Y + ε)    formula (12)
b = (1 - a)·f_mean(Y)    formula (13)
with Var_Y = f_mean(Y·Y) - f_mean(Y)·f_mean(Y).
Then, the formula by which the guided filter calculates the output image Q using the coefficients a and b is:
Q = f_mean(a)·I + f_mean(b)    formula (14)
It can be seen from formulas (12) to (14) that, for the output image Q, a larger ε increases the influence of the mean filtering, while a smaller ε increases the influence of the guide map I, so the details to be preserved can be controlled by adjusting ε. For I - q, a larger ε preserves more high-frequency detail; with a smaller ε, more regions of the output image q are similar to the guide map I, and I - q retains fewer of the high-frequency details of I - f_mean(I), only the sufficiently large ones. As shown in fig. 4 (a) and fig. 4 (b), when ε is small, less texture information is retained on the white tower on the right, while the high-frequency noise of the sky on the left is retained. Fig. 4 (a) shows a brightness enhancement image with ε = 0.001 and λ = 5; fig. 4 (b) shows a brightness enhancement image with ε = 0.5 and λ = 3.
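The simplified coefficient computation of formulas (12) to (14), where the guide map and the input are both the initial luminance map Y, can be sketched as follows; the use of scipy's uniform_filter as the mean filter f_mean and the 3×3 window size are assumptions, since the patent does not fix them.

```python
import numpy as np
from scipy.ndimage import uniform_filter  # assumed stand-in for the mean filter f_mean

def guided_filter_coefficients(luma, eps, window=3):
    """Self-guided coefficients and output: a = Var_Y / (Var_Y + eps), b = (1 - a) * f_mean(Y)
    (formulas (12)-(13)), then Q = f_mean(a) * Y + f_mean(b) (formula (14))."""
    f_mean = lambda x: uniform_filter(x, size=window, mode="nearest")
    mean_y = f_mean(luma)
    var_y = f_mean(luma * luma) - mean_y * mean_y   # formula (8) with I = p = Y
    a = var_y / (var_y + eps)                       # formula (12): larger eps -> stronger mean filtering
    b = (1.0 - a) * mean_y                          # formula (13)
    q = f_mean(a) * luma + f_mean(b)                # formula (14): guided-filter output
    return a, b, q
```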
In the above embodiment, the server may calculate an accurate filter coefficient by solving the objective function, and further filter the initial luminance image.
In one embodiment, obtaining a brightness enhancement image according to the brightness enhancement value and the color image to be processed includes: amplifying the color image to be processed to obtain an amplified image; and adding each pixel in the amplified image with the brightness enhancement value to obtain a brightness enhancement image.
Alternatively, the server may use any up-sampling method to enlarge the color image to be processed, such as bilinear interpolation, bicubic interpolation or Lanczos interpolation, to obtain the enlarged image.
Specifically, the server adds the corresponding brightness enhancement value to each pixel in the enlarged image to obtain the final brightness enhancement image; the brightness enhancement image can be calculated according to formula (1).
For example, if a pixel of the enlarged image is [100, 120, 140] and the brightness enhancement value corresponding to that pixel is 10, the pixel becomes [110, 130, 150] after brightness enhancement. It should be noted that the brightness enhancement value is single-channel, so each pixel corresponds to only one value; that is, the same brightness enhancement value is added to every channel of each pixel in the enlarged image.
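For illustration, the enlargement and channel-wise addition can be sketched as follows; scipy's zoom is used here as one possible bilinear upsampler, and the requirement that opY already be at the enlarged resolution is an assumption, since the patent does not state at which resolution opY is computed.

```python
import numpy as np
from scipy.ndimage import zoom  # assumed upsampler; bicubic or Lanczos resizing would also work

def enhance_upscaled_image(rgb, op_y, scale):
    """Upscale the color image, then add the single-channel enhancement value to every channel."""
    enlarged = zoom(rgb.astype(np.float64), (scale, scale, 1), order=1)  # bilinear upsampling
    # op_y is assumed to already match the enlarged resolution; it is broadcast to R, G and B.
    enhanced = enlarged + op_y[..., np.newaxis]
    return np.clip(enhanced, 0, 255).astype(np.uint8)
```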
In an exemplary embodiment, an image enhancement method is provided, which specifically includes the following steps:
(1) The initial luminance map Y is calculated by the formula (2).
(2) The filter coefficients are calculated according to formulas (4) to (13).
(3) Using the coefficients a and b at each position of Y, the corresponding filtered pixel values are calculated with side window filters in different directions, and a target brightness value is selected from them. In this embodiment, according to how much edge contour information should be retained in the brightness enhancement image, the filtered pixel value whose absolute difference from the brightness of the current pixel is largest or smallest is selected as the target brightness value, yielding the target brightness map.
(4) The brightness enhancement value is calculated from the initial brightness map and the target brightness map through formula (3).
(5) The color image to be processed is up-sampled, and the brightness enhancement value is then applied to each pixel of the up-sampled image to obtain the brightness enhancement image.
In this embodiment, by configuring the sharpening coefficient λ, the coefficient ε, and the choice between the largest and smallest absolute difference in the side window filter, the content and intensity of the texture information in the brightness enhancement value opY can be controlled, giving a high degree of freedom in controlling the texture content and intensity. A condensed end-to-end sketch combining steps (1) to (5) is given below.
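To show how steps (1) to (5) fit together, here is a condensed end-to-end sketch under the same assumptions as the earlier snippets; for brevity, step (3) is simplified to the plain guided-filter output of formula (14) instead of the eight-direction side window selection, and opY is simply up-sampled to the enlarged resolution, a detail the patent leaves open.

```python
import numpy as np
from scipy.ndimage import uniform_filter, zoom  # assumed helpers for f_mean and up-sampling

def enhance(rgb, eps=0.5, lam=3.0, scale=2, window=3):
    """Condensed pipeline: luminance -> coefficients -> target map -> opY -> upscale and add."""
    f_mean = lambda x: uniform_filter(x, size=window, mode="nearest")
    rgb = rgb.astype(np.float64)

    # (1) initial luminance map Y, formula (2), BT.709 coefficients
    y = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]

    # (2) filter coefficients, formulas (12)-(13)
    mean_y = f_mean(y)
    var_y = f_mean(y * y) - mean_y * mean_y
    a = var_y / (var_y + eps)
    b = (1.0 - a) * mean_y

    # (3) target luminance map; here the guided-filter output of formula (14)
    #     stands in for the eight-direction side window selection
    q = f_mean(a) * y + f_mean(b)

    # (4) brightness enhancement value, formula (3)
    op_y = (y - q) * lam

    # (5) upscale the color image (bilinear), bring opY to the same size, add it to every channel
    enlarged = zoom(rgb, (scale, scale, 1), order=1)
    op_y_up = zoom(op_y, (scale, scale), order=1)
    return np.clip(enlarged + op_y_up[..., np.newaxis], 0, 255).astype(np.uint8)
```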
It should be understood that, although the steps in the flowcharts related to the embodiments described above are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts of the above embodiments may include multiple steps or stages, which are not necessarily performed at the same moment but may be performed at different moments; the order of their execution is not necessarily sequential, and they may be performed in turn or alternately with at least some of the steps or stages of other steps.
Based on the same inventive concept, the embodiments of the present application also provide an image enhancement apparatus for implementing the above-mentioned image enhancement method. The implementation of the solution provided by the apparatus is similar to the implementation described in the above method, so the specific limitation in the embodiments of the image enhancement apparatus provided in the following may be referred to the limitation of the image enhancement method hereinabove, and will not be repeated here.
In one embodiment, as shown in fig. 5, there is provided an image enhancement apparatus comprising: an acquisition module 100, an extraction module 200, a calculation module 300, and an enhancement module 400, wherein:
an acquisition module 100 is configured to acquire a color image to be processed.
The extracting module 200 is configured to perform luminance extraction on the color image to be processed, so as to obtain an initial luminance map.
The calculating module 300 is configured to calculate based on the initial luminance map, and obtain a luminance enhancement value.
The enhancement module 400 is configured to obtain a brightness enhancement image according to the brightness enhancement value and the color image to be processed.
In one embodiment, the extracting module 200 includes:
and the pixel coefficient unit is used for acquiring the pixel extraction coefficient.
And the brightness extraction unit is used for obtaining an initial brightness image according to the pixel extraction coefficient and the color image to be processed.
In one embodiment, the enhancement module 400 includes:
and the filtering unit is used for filtering based on the initial brightness map and the filtering coefficient to obtain a target brightness map.
And the pixel enhancement unit is used for obtaining a brightness enhancement value according to the target brightness map and the initial brightness map.
In one embodiment, the filtering unit includes:
and the difference making subunit is used for making difference between the target brightness map and the initial brightness map to obtain a difference map.
And the sharpening subunit is used for obtaining a brightness enhancement value according to the difference value diagram and the sharpening coefficient.
In one embodiment, the pixel enhancement unit includes:
and the construction subunit is used for carrying out different-direction filtering on each pixel in the initial brightness map according to the filtering coefficient to obtain a filtering pixel value corresponding to each pixel.
And the target brightness subunit is used for combining the brightness value of each pixel and the filtered pixel value to obtain a target brightness map.
In one embodiment, the extracting module 200 further includes:
and the guiding unit is used for transforming the initial brightness map to obtain a guiding output map.
And the calculating unit is used for obtaining an objective function according to the guide output graph and the initial brightness value.
And the solving unit is used for solving the objective function to obtain a filter coefficient.
In one embodiment, the enhancement module 400 includes:
and the amplifying unit is used for amplifying the color image to be processed to obtain an amplified image.
And the pixel processing unit is used for adding each pixel in the amplified image with the brightness enhancement value to obtain a brightness enhancement image.
The respective modules in the above-described image enhancement apparatus may be implemented in whole or in part by software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 6. The computer device includes a processor, a memory, an Input/Output interface (I/O) and a communication interface. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is used for storing color image data to be processed. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an image enhancement method.
It will be appreciated by those skilled in the art that the structure shown in fig. 6 is merely a block diagram of some of the structures associated with the present application and is not limiting of the computer device to which the present application may be applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In an embodiment, a computer device is provided, comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method of any of the embodiments described above when the computer program is executed.
In an embodiment, a computer readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method of any of the embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method of any of the embodiments described above.
It should be noted that, the user information (including, but not limited to, user equipment information, user personal information, etc.) and the data (including, but not limited to, data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party, and the collection, use and processing of the related data are required to comply with the related laws and regulations and standards of the related countries and regions.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which, when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the various embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase change memory (PCM), graphene memory, and the like. Volatile memory may include random access memory (RAM) or external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational databases and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases, and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, quantum-computing-based data processing logic units, and the like, without being limited thereto.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples represent only a few embodiments of the present application; although they are described in some detail, they are not to be construed as limiting the scope of the present application. It should be noted that various modifications and improvements can be made by those skilled in the art without departing from the spirit of the present application, and such modifications and improvements fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (11)

1. A method of image enhancement, the method comprising:
acquiring a color image to be processed;
performing brightness extraction on the color image to be processed to obtain an initial brightness map;
calculating based on the initial brightness map to obtain a brightness enhancement value;
and obtaining a brightness enhancement image according to the brightness enhancement value and the color image to be processed.
2. The method according to claim 1, wherein the performing brightness extraction on the color image to be processed to obtain an initial brightness map includes:
acquiring a pixel extraction coefficient;
and obtaining the initial brightness map according to the pixel extraction coefficient and the color image to be processed.
3. The method of claim 1, wherein said calculating based on said initial luminance map results in a luminance enhancement value, comprising:
filtering based on the initial brightness map and the filter coefficient to obtain a target brightness map;
and obtaining the brightness enhancement value according to the target brightness map and the initial brightness map.
4. A method according to claim 3, wherein said obtaining said luminance enhancement value from said target luminance map and said initial luminance map comprises:
performing difference making on the target brightness map and the initial brightness map to obtain a difference map;
and obtaining the brightness enhancement value according to the difference value diagram and the sharpening coefficient.
5. A method according to claim 3, wherein said filtering based on said initial luminance map and filter coefficients to obtain a target luminance map comprises:
according to the filter coefficient, filtering each pixel in the initial brightness map in different directions to obtain a filter pixel value corresponding to each pixel;
and combining the brightness value of each pixel and the filtered pixel value to obtain the target brightness map.
6. A method according to claim 3, wherein the filter coefficients are calculated by a method comprising:
transforming according to the initial brightness map to obtain a guide output map;
obtaining an objective function according to the guide output graph and the initial brightness value;
and solving the objective function to obtain the filter coefficient.
7. The method according to claim 1, wherein obtaining the brightness enhancement image from the brightness enhancement value and the color image to be processed comprises:
amplifying the color image to be processed to obtain an amplified image;
and adding each pixel in the amplified image and the brightness enhancement value to obtain the brightness enhancement image.
8. An image enhancement device, the device comprising:
the acquisition module is used for acquiring the color image to be processed;
the extraction module is used for extracting the brightness of the color image to be processed to obtain an initial brightness image;
the calculation module is used for calculating based on the initial brightness map to obtain a brightness enhancement value;
and the enhancement module is used for obtaining a brightness enhancement image according to the brightness enhancement value and the color image to be processed.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 7 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
11. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
CN202310065902.2A 2023-01-13 2023-01-13 Image enhancement method, device, computer equipment and storage medium Pending CN116258644A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310065902.2A CN116258644A (en) 2023-01-13 2023-01-13 Image enhancement method, device, computer equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116258644A true CN116258644A (en) 2023-06-13

Family

ID=86680331

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310065902.2A Pending CN116258644A (en) 2023-01-13 2023-01-13 Image enhancement method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116258644A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130321700A1 (en) * 2012-05-31 2013-12-05 Apple Inc. Systems and Methods for Luma Sharpening
CN104796682A (en) * 2015-04-22 2015-07-22 福州瑞芯微电子有限公司 Image signal color enhancement method and image signal color enhancement device
CN109255758A (en) * 2018-07-13 2019-01-22 杭州电子科技大学 Image enchancing method based on full 1*1 convolutional neural networks
CN111383178A (en) * 2018-12-29 2020-07-07 Tcl集团股份有限公司 Image enhancement method and device and terminal equipment
US20220292658A1 (en) * 2019-10-21 2022-09-15 Zhejiang Uniview Technologies Co., Ltd. Image fusion method and apparatus, storage medium, and electronic device
CN110909686A (en) * 2019-11-26 2020-03-24 黑龙江大学 Low-illumination image enhancement system for driving assistance
CN111626962A (en) * 2020-05-27 2020-09-04 重庆邮电大学 CMOS endoscope image enhancement method
CN112116536A (en) * 2020-08-24 2020-12-22 山东师范大学 Low-illumination image enhancement method and system
CN114140344A (en) * 2021-11-05 2022-03-04 浙江科技学院 Image enhancement method and system for low-illumination image

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
唐宁; 赵鹏; 吴绍启: "Color image enhancement with improved multi-scale Retinex" (改进多尺度Retinex的彩色图像增强), Electronic Design Engineering (电子设计工程), no. 12, 20 June 2016 (2016-06-20) *
童莹: "Image enhancement algorithm based on homomorphic filtering coupled with post-processing optimization" (同态滤波耦合后处理优化的图像增强算法), Packaging Engineering (包装工程), no. 15, 10 August 2018 (2018-08-10) *
胡艳华; 唐新来; 崔亚楠: "Image enhancement algorithm using histogram classification coupled with fuzzy logic" (直方图分类耦合模糊逻辑的图像增强算法), Computer Engineering and Design (计算机工程与设计), no. 12, 16 December 2016 (2016-12-16) *

Similar Documents

Publication Publication Date Title
DE112018002228B4 (en) CONFIGURABLE CONVOLUTION ENGINE FOR NESTING CHANNEL DATA
EP1395041B1 (en) Colour correction of images
US20150363912A1 (en) Rgbw demosaic method by combining rgb chrominance with w luminance
US20110243428A1 (en) Bi-Affinity Filter: A Bilateral Type Filter for Color Images
US20080240602A1 (en) Edge mapping incorporating panchromatic pixels
US8164662B2 (en) Image-processing device for color image data and method for the image processing of color image data
EP3186954B1 (en) Image processing apparatus, image processing method, recording medium, and program
US11388355B2 (en) Multispectral image processing system and method
CN110211057B (en) Image processing method and device based on full convolution network and computer equipment
JP2022130642A (en) Adaptive Bilateral (BL) Filtering for Computer Vision
CN112801904B (en) Hybrid degraded image enhancement method based on convolutional neural network
CN111539893A (en) Bayer image joint demosaicing denoising method based on guided filtering
CN113112424A (en) Image processing method, image processing device, computer equipment and storage medium
CN111260580A (en) Image denoising method based on image pyramid, computer device and computer readable storage medium
DE112021006769T5 (en) CIRCUIT FOR COMBINED DOWNCLOCKING AND CORRECTION OF IMAGE DATA
CN111008936A (en) Multispectral image panchromatic sharpening method
WO2024055458A1 (en) Image noise reduction processing method and apparatus, device, storage medium, and program product
CN113454687A (en) Image processing method, apparatus and system, computer readable storage medium
CN111563866A (en) Multi-source remote sensing image fusion method
CN116258644A (en) Image enhancement method, device, computer equipment and storage medium
CN113674154B (en) Single image super-resolution reconstruction method and system based on generation countermeasure network
CN114638761A (en) Hyperspectral image panchromatic sharpening method, device and medium
WO2021139380A1 (en) Image processing method and device, electronic device
US20220215505A1 (en) System and method of processing of a captured image to facilitate post-processing modification
DE112021002288T5 (en) CONTENT-BASED IMAGE PROCESSING

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination