CN111383178A - Image enhancement method and device and terminal equipment - Google Patents
- Publication number
- CN111383178A (application number CN201811619860.8A)
- Authority
- CN
- China
- Prior art keywords
- image
- processed
- pixel point
- brightness value
- value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T5/90 — Image enhancement or restoration; dynamic range modification of images or parts thereof
- G06T5/10 — Image enhancement or restoration using non-spatial domain filtering
- G06T5/70 — Denoising; smoothing
- G06T7/90 — Image analysis; determination of colour characteristics
- G06T2207/20028 — Bilateral filtering (indexing scheme for image analysis or image enhancement; special algorithmic details; filtering details)

All classes fall under G (Physics) › G06 (Computing; calculating or counting) › G06T (Image data processing or generation, in general).
Abstract
The application belongs to the technical field of image processing and provides an image enhancement method, an image enhancement device, and a terminal device. The method comprises the following steps: acquiring an image to be processed, and processing the image to be processed through a bilateral filtering algorithm to obtain a filtered brightness value for each pixel point in the image to be processed; and calculating a target brightness value for each pixel point according to the original brightness value and the filtered brightness value of the pixel point. The method and the device can solve the problems that existing image enhancement methods easily generate false object edges and produce a poor image enhancement effect.
Description
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image enhancement method, an image enhancement device, and a terminal device.
Background
Image enhancement is an image processing method that purposefully emphasizes the overall or local characteristics of an image: it can make an originally unclear image clear, emphasize features of interest, enlarge the differences between different object features in the image, and suppress uninteresting features, thereby improving image quality and enhancing the effectiveness of image interpretation and recognition.
The current mainstream image enhancement approach uses a Gaussian filter, but Gaussian filtering easily produces false object edges, so the enhancement effect is poor.
In conclusion, existing image enhancement methods easily generate false object edges and produce a poor image enhancement effect.
Disclosure of Invention
In view of this, embodiments of the present application provide an image enhancement method, an image enhancement device, and a terminal device to solve the problems that existing image enhancement methods easily generate false object edges and produce a poor image enhancement effect.
A first aspect of an embodiment of the present application provides an image enhancement method, including:
acquiring an image to be processed, and processing the image to be processed through a bilateral filtering algorithm to obtain a filtering brightness value of a pixel point in the image to be processed;
and calculating the target brightness value of the pixel point according to the original brightness value and the filtering brightness value of the pixel point.
A second aspect of an embodiment of the present application provides an image enhancement apparatus, including:
the bilateral filtering module is used for acquiring an image to be processed, and processing the image to be processed through a bilateral filtering algorithm to obtain a filtering brightness value of a pixel point in the image to be processed;
and the brightness calculation module is used for calculating the target brightness value of the pixel point according to the original brightness value and the filtering brightness value of the pixel point.
A third aspect of the embodiments of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the steps of the method when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, in which a computer program is stored, which, when executed by a processor, implements the steps of the method as described above.
Compared with the prior art, the embodiment of the application has the advantages that:
according to the image enhancement method, the image to be processed is processed through the bilateral filtering algorithm, the filtering brightness value of the pixel point is obtained, the target brightness value of the pixel point is calculated according to the original brightness value of the pixel point and the filtering brightness value, the original brightness value and the filtering brightness value are used for calculating the target brightness value, the image can be enhanced under the condition that no error edge is generated, and the problems that an error object edge is easily generated and the image enhancement effect is poor in the existing image method are solved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of an implementation of an image enhancement method provided in an embodiment of the present application;
fig. 2 is a schematic diagram of an image enhancement apparatus provided in an embodiment of the present application;
fig. 3 is a schematic diagram of a terminal device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
In addition, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
The first embodiment is as follows:
referring to fig. 1, an image enhancement method according to a first embodiment of the present application is described below, where the image enhancement method according to the first embodiment of the present application includes:
s101, acquiring an image to be processed, and processing the image to be processed through a bilateral filtering algorithm to obtain a filtering brightness value of a pixel point in the image to be processed;
When the image to be processed needs to be enhanced, the image is first processed with a bilateral filtering algorithm to obtain the filtered brightness value of each pixel point; compared with the original brightness value, the filtered brightness value loses part of the high-frequency information. The bilateral filtering algorithm can be represented by the following formulas:

A(x, y) = Σ_{i=−m}^{m} Σ_{j=−n}^{n} I(x−i, y−j) · exp(−(i² + j²)/(2σ_d²)) · exp(−diff²/(2σ_c²))

B(x, y) = Σ_{i=−m}^{m} Σ_{j=−n}^{n} exp(−(i² + j²)/(2σ_d²)) · exp(−diff²/(2σ_c²))

F(x, y) = A(x, y) / B(x, y)

where (x, y) is the pixel coordinate of the first pixel point, (x−i, y−j) is the pixel coordinate of the second pixel point, i is the horizontal coordinate difference and j the vertical coordinate difference between the first and second pixel points, I denotes the original brightness value of a pixel point, diff² is the square of the absolute brightness difference between the original brightness values of the first and second pixel points, σ_c and σ_d are control parameters of the bilateral filtering algorithm, m and n are size parameters of the bilateral filtering algorithm, A(x, y) is the unnormalized filtered brightness value of the first pixel point, B(x, y) is the normalization weight of the first pixel point, and F(x, y) is the normalized filtered brightness value of the first pixel point.
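The bilateral filtering step can be sketched in plain Python as below. This is an illustrative implementation, not the patent's code; the window half-sizes `m`, `n` and the control parameters `sigma_c`, `sigma_d` are assumed free parameters, and border pixels are handled by simply skipping out-of-range neighbours.

```python
import numpy as np

def bilateral_filter(I, m=1, n=1, sigma_c=25.0, sigma_d=3.0):
    """Sketch of the bilateral filter described above.

    I        -- 2-D array of original brightness values, indexed [y, x]
    m, n     -- size parameters (window half-widths)
    sigma_c  -- brightness (range) control parameter
    sigma_d  -- spatial (distance) control parameter
    """
    I = I.astype(np.float64)
    H, W = I.shape
    F = np.zeros_like(I)
    for y in range(H):
        for x in range(W):
            A = 0.0  # unnormalized filtered brightness A(x, y)
            B = 0.0  # normalization weight B(x, y)
            for j in range(-n, n + 1):
                for i in range(-m, m + 1):
                    yy, xx = y - j, x - i
                    if 0 <= yy < H and 0 <= xx < W:
                        diff = I[y, x] - I[yy, xx]
                        w = np.exp(-(i * i + j * j) / (2 * sigma_d ** 2)) \
                            * np.exp(-(diff * diff) / (2 * sigma_c ** 2))
                        A += w * I[yy, xx]
                        B += w
            F[y, x] = A / B  # normalized filtered brightness F(x, y)
    return F
```

Because the weights depend on the brightness difference as well as distance, pixels across a strong edge contribute little, which is what preserves edges compared with a pure Gaussian filter.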
And S102, calculating a target brightness value of the pixel point according to the original brightness value and the filtering brightness value of the pixel point.
After the filtering brightness value of the pixel point is obtained through calculation, the target brightness value of the pixel point can be calculated according to the original brightness value and the filtering brightness value of the pixel point, and the image is enhanced under the condition that no error edge is generated.
Further, the calculating the target brightness value of the pixel point according to the original brightness value and the filter brightness value of the pixel point specifically includes:
a1, subtracting the corresponding filtering brightness value from the original brightness value of the pixel point to obtain the brightness difference value of the pixel point;
Subtracting the corresponding filtered brightness value from the original brightness value of a pixel point yields the brightness difference value, which represents part of the high-frequency information of the original brightness value. The expression for calculating the brightness difference value is:
H(x,y)=I(x,y)-F(x,y)
wherein, H (x, y) represents the luminance difference of the first pixel.
And A2, multiplying the brightness difference value of the pixel point by a preset coefficient, and then adding the multiplied brightness difference value to the corresponding original brightness value to obtain the target brightness value of the pixel point.
Multiplying the brightness difference value of the pixel point by a preset coefficient, and then adding the multiplied brightness difference value to the corresponding original brightness value, so that part of high-frequency components of the original brightness value of the pixel point can be enhanced, and a target brightness value is obtained, wherein the expression for calculating the target brightness value is as follows:
I_enhanced(x, y) = I(x, y) + αH(x, y)
where I_enhanced(x, y) is the target brightness value of the first pixel point and α is a preset coefficient.
And when the brightness value of each pixel point in the image to be processed is updated to the corresponding target brightness value, obtaining the target image after image enhancement.
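Steps A1 and A2 amount to an unsharp-mask style update and can be sketched as follows. The clamping to the 8-bit range [0, 255] is an assumption for illustration; the patent does not specify how out-of-range target values are handled.

```python
import numpy as np

def enhance(I, F, alpha=1.5):
    """Sketch of steps A1/A2.

    I     -- original brightness values
    F     -- filtered brightness values from the bilateral filter
    alpha -- preset coefficient controlling enhancement strength
    """
    I = I.astype(np.float64)
    H = I - F  # step A1: brightness difference (high-frequency part)
    # Step A2: scale the high-frequency part and add it back;
    # clamping to [0, 255] is an illustrative assumption.
    return np.clip(I + alpha * H, 0, 255)
```

Updating every pixel's brightness to the returned target value yields the enhanced target image.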
Further, the obtaining of the image to be processed and processing the image to be processed through a bilateral filtering algorithm to obtain the filtering brightness value of the pixel point in the image to be processed specifically includes:
b1, acquiring an image to be processed;
b2, establishing a coordinate system in the image to be processed to obtain pixel coordinates of pixel points in the image to be processed;
after the image to be processed is obtained, a coordinate system needs to be established in the image to be processed, so that the pixel coordinates of each pixel point are determined, and the processing of the bilateral filtering algorithm is facilitated.
And B3, processing the pixel coordinate and the original brightness value of the pixel point through a bilateral filtering algorithm to obtain a filtering brightness value of the pixel point.
And acquiring the pixel coordinates and the original brightness values of all the pixel points, substituting the pixel coordinates and the original brightness values of all the pixel points into a bilateral filtering algorithm for calculation, and calculating the filtering brightness values of all the pixel points respectively.
Further, the establishing a coordinate system in the image to be processed to obtain pixel coordinates of pixels in the image to be processed specifically includes:
and C1, establishing a coordinate system by taking the top left corner vertex of the image to be processed as a coordinate origin, taking the first edge connected with the coordinate origin as an X axis and taking the second edge connected with the coordinate origin as a Y axis, and obtaining the pixel coordinates of the pixel points in the image to be processed.
When the coordinate system is established, the vertex at the upper left corner of the image to be processed may be used as the origin of coordinates, the first edge connected to the origin of coordinates may be used as the X-axis, and the second edge connected to the origin of coordinates may be used as the Y-axis, for example, the horizontal edge connected to the origin of coordinates may be used as the X-axis, and the vertical edge connected to the origin of coordinates may be used as the Y-axis, so as to establish the coordinate system.
The positive direction of the X axis and the positive direction of the Y axis in the coordinate system can be set according to actual conditions.
And after the coordinate system is established, the pixel coordinates of the pixel points in the image to be processed can be obtained.
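The coordinate system of step C1 can be obtained for every pixel at once; a minimal sketch using NumPy (the 4×6 image size is arbitrary):

```python
import numpy as np

h, w = 4, 6                  # image height and width in pixels (illustrative)
ys, xs = np.indices((h, w))  # y (row) and x (column) coordinate of every pixel
# The top-left vertex is the coordinate origin (0, 0); x grows along the
# first (horizontal) edge and y along the second (vertical) edge.
```

The arrays `xs` and `ys` then give the pixel coordinates that are substituted into the bilateral filtering algorithm.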
Further, the acquiring the image to be processed further comprises:
d1, judging whether the color coding format of the image to be processed is a YUV format;
in this embodiment, the brightness value of each pixel point is adjusted to achieve an image enhancement effect, so that when the to-be-processed image is obtained, it is necessary to determine whether the color coding format of the to-be-processed image is the YUV format.
D2, when the color coding format of the image to be processed is a non-YUV format, converting the color coding format of the image to be processed into a YUV format.
And when the color coding format of the image to be processed is a non-YUV format, performing format conversion on the image to be processed, and converting the color coding format of the image to be processed into a YUV format.
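A format conversion of this kind can be sketched with a standard RGB-to-YUV matrix. The patent does not fix a particular conversion; the BT.601 coefficients below are one common choice and are an assumption here.

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Illustrative RGB -> YUV conversion using BT.601 coefficients.

    rgb -- array of shape (..., 3) holding R, G, B values
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b          # luma: the channel enhanced above
    u = -0.14713 * r - 0.28886 * g + 0.436 * b     # blue-difference chroma
    v = 0.615 * r - 0.51499 * g - 0.10001 * b      # red-difference chroma
    return np.stack([y, u, v], axis=-1)
```

After conversion, only the Y channel is processed by the enhancement steps, leaving the chroma channels unchanged.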
In the image enhancement method provided by this embodiment, the image to be processed is processed through a bilateral filtering algorithm to obtain the filtered brightness value of each pixel point, and the target brightness value of each pixel point is calculated from its original brightness value and its filtered brightness value. Because the target brightness value is computed from both values, the image can be enhanced without generating false edges, which solves the problems that conventional image methods easily generate false object edges and produce a poor image enhancement effect.
After the filtered brightness value of a pixel point is obtained, it is subtracted from the original brightness value to obtain the brightness difference value, which represents part of the high-frequency information of the original brightness value. The brightness difference value is then multiplied by a preset coefficient and added to the corresponding original brightness value to obtain the target brightness value, so part of the high-frequency information of the pixel point can be enhanced without generating false edges.
After the image to be processed is obtained, a coordinate system needs to be established in the image to be processed to obtain pixel coordinates of each pixel point so as to perform bilateral filtering processing, and when the coordinates are established, the upper left corner of the image to be processed can be used as a coordinate origin, and two sides connected with the coordinate origin are used as an X-axis and a Y-axis to establish the coordinate system.
In addition, the color coding format of the image to be processed can be detected, and if the color coding format of the image to be processed is a non-YUV format, the color coding format of the image to be processed is converted into a YUV format.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Example two:
the second embodiment of the present application provides an image enhancement apparatus, which is only shown in relevant parts for convenience of description, and as shown in fig. 2, the image enhancement apparatus includes,
the bilateral filtering module 201 is configured to obtain an image to be processed, and process the image to be processed through a bilateral filtering algorithm to obtain a filtering brightness value of a pixel point in the image to be processed;
and the brightness calculation module 202 is configured to calculate a target brightness value of the pixel point according to the original brightness value and the filter brightness value of the pixel point.
Further, the brightness calculation module 202 specifically includes:
the difference submodule is used for subtracting the corresponding filtering brightness value from the original brightness value of the pixel point to obtain the brightness difference value of the pixel point;
and the target submodule is used for multiplying the brightness difference value of the pixel point by a preset coefficient and then adding the multiplied brightness difference value and the corresponding original brightness value to obtain a target brightness value of the pixel point.
Further, the bilateral filtering module 201 specifically includes:
the acquisition submodule is used for acquiring an image to be processed;
the coordinate submodule is used for establishing a coordinate system in the image to be processed to obtain pixel coordinates of pixel points in the image to be processed;
and the filtering submodule is used for processing through a bilateral filtering algorithm according to the pixel coordinates and the original brightness values of the pixel points to obtain the filtering brightness values of the pixel points.
Further, the coordinate submodule is specifically configured to use an upper left corner vertex of the image to be processed as a coordinate origin, use a first edge connected to the coordinate origin as an X-axis, and use a second edge connected to the coordinate origin as a Y-axis, and establish a coordinate system to obtain pixel coordinates of pixel points in the image to be processed.
Further, the bilateral filtering module further comprises:
the detection submodule is used for judging whether the color coding format of the image to be processed is a YUV format;
and the format submodule is used for converting the color coding format of the image to be processed into a YUV format when the color coding format of the image to be processed is a non-YUV format.
Further, the difference submodule includes the following expression:
H(x,y)=I(x,y)-F(x,y)
h (x, y) is the brightness difference value of the pixel point, I (x, y) is the original brightness value of the pixel point, and F (x, y) is the filtering brightness value of the pixel point.
Further, the target sub-module includes the following expression:
I_enhanced(x, y) = I(x, y) + αH(x, y)
where I_enhanced(x, y) is the target brightness value of the pixel point and α is a preset coefficient.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
Example three:
fig. 3 is a schematic diagram of a terminal device provided in the third embodiment of the present application. As shown in fig. 3, the terminal device 3 of this embodiment includes: a processor 30, a memory 31 and a computer program 32 stored in said memory 31 and executable on said processor 30. The processor 30, when executing the computer program 32, implements the steps in the above-described embodiment of the image enhancement method, such as the steps S101 to S102 shown in fig. 1. Alternatively, the processor 30, when executing the computer program 32, implements the functions of each module/unit in the above-mentioned device embodiments, for example, the functions of the modules 201 to 202 shown in fig. 2.
Illustratively, the computer program 32 may be partitioned into one or more modules/units that are stored in the memory 31 and executed by the processor 30 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 32 in the terminal device 3. For example, the computer program 32 may be divided into a bilateral filtering module and a brightness calculation module, and each module has the following specific functions:
the bilateral filtering module is used for acquiring an image to be processed, and processing the image to be processed through a bilateral filtering algorithm to obtain a filtering brightness value of a pixel point in the image to be processed;
and the brightness calculation module is used for calculating the target brightness value of the pixel point according to the original brightness value and the filtering brightness value of the pixel point.
The terminal device 3 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal device may include, but is not limited to, a processor 30, a memory 31. It will be understood by those skilled in the art that fig. 3 is only an example of the terminal device 3, and does not constitute a limitation to the terminal device 3, and may include more or less components than those shown, or combine some components, or different components, for example, the terminal device may also include an input-output device, a network access device, a bus, etc.
The Processor 30 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 31 may be an internal storage unit of the terminal device 3, such as a hard disk or a memory of the terminal device 3. The memory 31 may also be an external storage device of the terminal device 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 3. Further, the memory 31 may also include both an internal storage unit and an external storage device of the terminal device 3. The memory 31 is used for storing the computer program and other programs and data required by the terminal device. The memory 31 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (10)
1. An image enhancement method, comprising:
acquiring an image to be processed, and processing the image to be processed through a bilateral filtering algorithm to obtain a filtering brightness value of a pixel point in the image to be processed;
and calculating the target brightness value of the pixel point according to the original brightness value and the filtering brightness value of the pixel point.
2. The image enhancement method according to claim 1, wherein the calculating the target luminance value of the pixel point according to the original luminance value and the filtered luminance value of the pixel point specifically comprises:
subtracting the corresponding filtering brightness value from the original brightness value of the pixel point to obtain a brightness difference value of the pixel point;
and multiplying the brightness difference value of the pixel point by a preset coefficient, and then adding the multiplied brightness difference value to the corresponding original brightness value to obtain a target brightness value of the pixel point.
3. The image enhancement method according to claim 1, wherein the obtaining of the image to be processed and the processing of the image to be processed by the bilateral filtering algorithm to obtain the filtering brightness value of the pixel point in the image to be processed specifically comprises:
acquiring an image to be processed;
establishing a coordinate system in the image to be processed to obtain pixel coordinates of pixel points in the image to be processed;
and processing the pixel coordinates and the original brightness values of the pixel points through a bilateral filtering algorithm to obtain the filtering brightness values of the pixel points.
4. The image enhancement method according to claim 3, wherein the establishing of the coordinate system in the image to be processed to obtain the pixel coordinates of the pixel points in the image to be processed specifically comprises:
and establishing a coordinate system by taking the top left corner vertex of the image to be processed as a coordinate origin, taking a first edge connected with the coordinate origin as an X axis and taking a second edge connected with the coordinate origin as a Y axis to obtain pixel coordinates of pixel points in the image to be processed.
5. The image enhancement method according to claim 3 or 4, further comprising, after acquiring the image to be processed:
judging whether the color coding format of the image to be processed is a YUV format or not;
and when the color coding format of the image to be processed is a non-YUV format, converting the color coding format of the image to be processed into a YUV format.
6. The image enhancement method according to claim 2, wherein the expression of the luminance difference value of the pixel point obtained by subtracting the corresponding filtered luminance value from the original luminance value of the pixel point is:
H(x,y)=I(x,y)-F(x,y)
wherein H(x, y) is the brightness difference value of the pixel point, I(x, y) is the original brightness value of the pixel point, and F(x, y) is the filtering brightness value of the pixel point.
7. The image enhancement method according to claim 6, wherein the expression of the target luminance value of the pixel point obtained by adding the luminance difference value of the pixel point multiplied by a preset coefficient to the corresponding original luminance value is:
I_enhanced(x, y) = I(x, y) + αH(x, y)
wherein I_enhanced(x, y) is the target brightness value of the pixel point, and α is a preset coefficient.
8. An image enhancement apparatus, comprising:
the bilateral filtering module is used for acquiring an image to be processed, and processing the image to be processed through a bilateral filtering algorithm to obtain a filtering brightness value of a pixel point in the image to be processed;
and the brightness calculation module is used for calculating the target brightness value of the pixel point according to the original brightness value and the filtering brightness value of the pixel point.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
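Taken together, claims 1 to 7 describe bilateral-filter-based unsharp masking on the luminance channel: F is the bilateral filtering of the original luminance I, H = I − F is the detail layer, and I_enhanced = I + αH. A minimal NumPy sketch of this pipeline follows; the function names, window radius, and the sigma defaults are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def bilateral_filter(lum, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Brute-force bilateral filter F(x, y) of a 2-D luminance array."""
    h, w = lum.shape
    lum = lum.astype(np.float64)
    pad = np.pad(lum, radius, mode='edge')
    # Spatial Gaussian weights are fixed for every window position.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma_s ** 2))
    out = np.empty_like(lum)
    for i in range(h):
        for j in range(w):
            window = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range weights suppress neighbors whose brightness differs a lot,
            # so strong edges are preserved rather than blurred.
            range_w = np.exp(-((window - lum[i, j]) ** 2) / (2.0 * sigma_r ** 2))
            weights = spatial * range_w
            out[i, j] = np.sum(weights * window) / np.sum(weights)
    return out

def enhance(lum, alpha=1.5, **filter_kw):
    """I_enhanced = I + alpha * H, with H = I - F per claims 6 and 7."""
    I = lum.astype(np.float64)
    F = bilateral_filter(lum, **filter_kw)
    H = I - F                       # luminance difference (detail layer)
    return np.clip(I + alpha * H, 0.0, 255.0)
```

Because the bilateral range term keeps strong edges mostly out of H, amplifying H with α > 0 boosts fine detail with weaker halo artifacts than Gaussian-based unsharp masking would produce.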
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811619860.8A CN111383178A (en) | 2018-12-29 | 2018-12-29 | Image enhancement method and device and terminal equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111383178A true CN111383178A (en) | 2020-07-07 |
Family
ID=71221779
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811619860.8A Pending CN111383178A (en) | 2018-12-29 | 2018-12-29 | Image enhancement method and device and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111383178A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010010975A (en) * | 2008-06-25 | 2010-01-14 | Noritsu Koki Co Ltd | Photographic image processing method, photographic image processing program, and photographic image processing device |
CN103700067A (en) * | 2013-12-06 | 2014-04-02 | 浙江宇视科技有限公司 | Method and device for promoting image details |
US20150139566A1 (en) * | 2013-11-20 | 2015-05-21 | Ricoh Company, Ltd. | Image processing device and image processing method |
CN106683056A (en) * | 2016-12-16 | 2017-05-17 | 凯迈(洛阳)测控有限公司 | Airborne photoelectric infrared digital image processing method and apparatus thereof |
CN109785239A (en) * | 2017-11-13 | 2019-05-21 | 华为技术有限公司 | The method and apparatus of image procossing |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112258413A (en) * | 2020-10-23 | 2021-01-22 | 浙江大华技术股份有限公司 | Image enhancement method, electronic device and storage medium |
CN113298761A (en) * | 2021-05-07 | 2021-08-24 | 奥比中光科技集团股份有限公司 | Image filtering method, device, terminal and computer readable storage medium |
CN113744294A (en) * | 2021-08-09 | 2021-12-03 | 深圳曦华科技有限公司 | Image processing method and related device |
CN113744294B (en) * | 2021-08-09 | 2023-12-19 | 深圳曦华科技有限公司 | Image processing method and related device |
CN115511755A (en) * | 2022-11-22 | 2022-12-23 | 杭州雄迈集成电路技术股份有限公司 | Video stream image self-adaptive enhancement method and system |
CN115511755B (en) * | 2022-11-22 | 2023-03-10 | 杭州雄迈集成电路技术股份有限公司 | Video stream image self-adaptive enhancement method and system |
CN116258644A (en) * | 2023-01-13 | 2023-06-13 | 格兰菲智能科技有限公司 | Image enhancement method, device, computer equipment and storage medium |
CN116258644B (en) * | 2023-01-13 | 2024-10-29 | 格兰菲智能科技股份有限公司 | Image enhancement method, device, computer equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111383178A (en) | Image enhancement method and device and terminal equipment | |
CN109191395B (en) | Image contrast enhancement method, device, equipment and storage medium | |
CN110473242B (en) | Texture feature extraction method, texture feature extraction device and terminal equipment | |
CN107358586B (en) | Image enhancement method, device and equipment | |
CN109146855B (en) | Image moire detection method, terminal device and storage medium | |
CN110874827B (en) | Turbulent image restoration method and device, terminal equipment and computer readable medium | |
CN109214996B (en) | Image processing method and device | |
CN109389560B (en) | Adaptive weighted filtering image noise reduction method and device and image processing equipment | |
CN109309826B (en) | Image color balancing method and device, terminal equipment and readable storage medium | |
CN110969046B (en) | Face recognition method, face recognition device and computer-readable storage medium | |
CN111861938B (en) | Image denoising method and device, electronic equipment and readable storage medium | |
CN114862897B (en) | Image background processing method and device and electronic equipment | |
CN113487473B (en) | Method and device for adding image watermark, electronic equipment and storage medium | |
CN111626967A (en) | Image enhancement method, image enhancement device, computer device and readable storage medium | |
CN113344801A (en) | Image enhancement method, system, terminal and storage medium applied to gas metering facility environment | |
CN111563517A (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN111126248A (en) | Method and device for identifying shielded vehicle | |
CN109584165A (en) | A kind of antidote of digital picture, device, medium and electronic equipment | |
CN113744294A (en) | Image processing method and related device | |
CN111311610A (en) | Image segmentation method and terminal equipment | |
CN108629219B (en) | Method and device for identifying one-dimensional code | |
CN114596210A (en) | Noise estimation method, device, terminal equipment and computer readable storage medium | |
CN111160363B (en) | Method and device for generating feature descriptors, readable storage medium and terminal equipment | |
JP4104475B2 (en) | Contour correction device | |
CN110363723B (en) | Image processing method and device for improving image boundary effect |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2020-07-07