CN112887693B - Image purple border elimination method, equipment and storage medium - Google Patents

Image purple border elimination method, equipment and storage medium

Info

Publication number
CN112887693B
CN112887693B (application CN202110036152.7A)
Authority
CN
China
Prior art keywords
image
purple
pixel
value
fringed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110036152.7A
Other languages
Chinese (zh)
Other versions
CN112887693A (en)
Inventor
刘硕
俞克强
王松
刘晓沐
李骏
艾成汉
陈虹宇
董振昊
王雨彤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202110036152.7A priority Critical patent/CN112887693B/en
Publication of CN112887693A publication Critical patent/CN112887693A/en
Application granted granted Critical
Publication of CN112887693B publication Critical patent/CN112887693B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/646Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters

Abstract

The application discloses an image purple fringing elimination method, a device, and a storage medium. The image purple fringing elimination method comprises the following steps: acquiring a target image; extracting chrominance information of the target image, and extracting an initial purple-fringed image by using the chrominance information; obtaining a plurality of binary images through threshold segmentation by using information of the target image and the initial purple-fringed image, and merging the plurality of binary images to obtain a purple-fringed pixel marking image; performing object edge color detection on the purple-fringed pixel marking image, and removing object colors from the purple-fringed pixel marking image according to the detection result to obtain a corrected purple-fringed pixel marking image; multiplying the corrected purple-fringed pixel marking image with the initial purple-fringed image to obtain a corrected purple-fringed image; and performing purple fringing correction on the target image by using the corrected purple-fringed image to obtain a purple-fringing-corrected image. The method effectively solves the problems of missed detection and misjudgment of purple fringing, removes the purple fringing cleanly with a natural transition, and improves the visual effect of the image.

Description

Image purple border elimination method, equipment and storage medium
Technical Field
The application belongs to the technical field of image processing, and particularly relates to a method and equipment for eliminating purple fringing of an image and a storage medium.
Background
Imperfections in imaging devices often result in some form of color noise or distortion. Chromatic aberration is a lens defect in which the lens cannot optically focus light of different wavelengths onto the same point. When a digital camera shoots a scene with high brightness contrast, obvious purple fringing appears at the junction of highlight and low-light areas. The degree of purple fringing depends on the shooting scene, the camera lens, the sensor and the internal interpolation algorithm; it cannot be completely avoided and greatly degrades the image quality and visual effect of the imaging system.
The occurrence of purple fringing can be reduced to a certain extent by changing the lens design, but this approach is costly and cannot completely suppress purple fringing. In addition, because of variations in lens quality and the complexity of shooting scenes, purple fringing differs from image to image; a wide purple fringe spanning several tens of pixels may occur, and purple fringing may also appear in non-highlight regions. Therefore, a method for effectively eliminating purple fringing in images is needed.
Disclosure of Invention
The application provides an image purple fringing elimination method, a device, and a storage medium, in order to solve the problem that purple fringing in images is difficult to eliminate.
In order to solve the technical problem, the application adopts a technical scheme that: an image purple fringing elimination method, the method comprising: acquiring a target image, wherein the target image is a color image containing purple fringing; extracting the chrominance information of the target image, and extracting an initial purple-fringed image by using the chrominance information; obtaining a plurality of binary images through threshold segmentation by utilizing the information of the target image and the initial purple fringed image, and combining the plurality of binary images to obtain a purple fringed pixel mark image; carrying out object edge color detection on the purple fringed pixel label image, and removing the object color in the purple fringed pixel label image according to the detection result to obtain a corrected purple fringed pixel label image; multiplying the corrected purple fringing pixel marking image with the initial purple fringing image to obtain a corrected purple fringing image; and carrying out purple fringing correction processing on the target image by utilizing the corrected purple fringing image to obtain a purple fringing corrected image.
According to an embodiment of the present application, the extracting chrominance information of the target image includes: acquiring a red intensity value, a green intensity value and a blue intensity value of each pixel point of the target image; acquiring first chroma information, wherein the first chroma information is a difference value or a ratio of the red intensity value and the green intensity value, and if the first chroma information is a negative number, marking the first chroma information as 0; and acquiring second chroma information, wherein the second chroma information is the difference or ratio of the blue intensity value and the green intensity value, and if the second chroma information is a negative number, marking the second chroma information as 0.
According to an embodiment of the present application, the extracting chrominance information of the target image includes: converting the target image into a brightness and chrominance separation space, and obtaining a red intensity value, a blue intensity value and a luminance signal of each pixel point of the target image; acquiring first chroma information, wherein the first chroma information is a difference value between the red intensity value and the brightness signal, and if the first chroma information is a negative number, the first chroma information is marked as 0; and acquiring second chrominance information, wherein the second chrominance information is the difference value between the blue intensity value and the brightness signal, and if the second chrominance information is a negative number, recording the second chrominance information as 0.
According to an embodiment of the present application, the extracting an initial purple-fringing image by using the chrominance information includes: calculating a red channel pixel value of each pixel point, wherein the red channel pixel value is the smaller value of the first chrominance information and a first product, and the first product is the product of the second chrominance information and a preset maximum proportion; calculating a blue channel pixel value of each pixel point, wherein the blue channel pixel value is the smaller value of the second chrominance information and a first ratio, and the first ratio is the ratio of the first chrominance information and a preset minimum proportion; and calculating a green channel pixel value of each pixel point, wherein the green channel pixel value is set to be 0.
According to an embodiment of the present application, obtaining a plurality of binary images by threshold segmentation using information of the target image and the initial purple-fringed image, and combining the plurality of binary images to obtain a purple-fringed pixel label image includes: converting the target image into a gray-scale image, calculating the variance of each pixel point in the gray-scale image, comparing the variance with a first threshold value, recording the pixel points which are greater than or equal to the first threshold value as 1, and recording the pixel points which are smaller than the first threshold value as 0 to obtain a first binary image; adding pixel values of all channels in the initial purple fringing image to convert the pixel values into a single-channel image, calculating the variance of each pixel point in the single-channel image, comparing the variance with the second threshold value, recording the pixel points which are greater than or equal to the second threshold value as 1, and recording the pixel points which are smaller than the second threshold value as 0 to obtain a second binary image; calculating histogram information of the gray-scale image, comparing the histogram information with a third threshold value, taking pixel points with intensity values larger than or equal to the third threshold value as a highlight area, performing morphological expansion on the highlight area to obtain an expanded area, and subtracting the highlight area from the expanded area to obtain a high-contrast area, wherein the high-contrast area is a third binary image; and merging the first binary image, the second binary image and the third binary image to obtain a purple-edge pixel marker image.
According to an embodiment of the present application, the performing object edge color detection on the purple fringed pixel label image includes: taking each marked pixel point in the purple fringed pixel label image as a center, calculating difference values in a horizontal direction, a vertical direction, a main diagonal direction and an auxiliary diagonal direction within a first preset size window, wherein the main diagonal direction is perpendicular to the auxiliary diagonal direction; selecting the direction with the largest difference value as the maximum difference direction, and selecting the first non-marked pixel and the second non-marked pixel closest to the marked pixel point in the maximum difference direction; on the initial purple-fringed image, calculating the summation average value of all non-marked pixels within a second preset size window centered on the first non-marked pixel and on the second non-marked pixel, respectively; calculating the absolute value of the difference between the summation average value and the corresponding marked pixel point to obtain an object edge detection value of the marked pixel point; and comparing the object edge detection value with a fourth threshold, and taking the color of the marked pixel point as an object color in response to the object edge detection value being smaller than the fourth threshold.
According to an embodiment of the present application, before the multiplying the corrected purple-fringed pixel marking image with the initial purple-fringed image, the method further comprises: smoothing the corrected purple-fringed pixel marking image.
According to an embodiment of the present application, the performing purple fringing correction processing on the target image by using the corrected purple fringed image to obtain a purple fringed corrected image includes: subtracting the corrected purple fringing image from the target image to obtain a purple fringing-eliminated corrected image; or, converting to a bright-chroma separation space, keeping the brightness of the target image unchanged according to the modified purple fringing image, modifying the chroma of the target image, and obtaining the purple fringing eliminated corrected image.
In order to solve the above technical problem, the present application adopts another technical solution: an electronic device comprising a memory and a processor coupled to each other, the processor being configured to execute program instructions stored in the memory to implement any of the above methods.
In order to solve the above technical problem, the present application adopts another technical solution: a computer readable storage medium having stored thereon program data which, when executed by a processor, implements any of the methods described above.
The beneficial effects of the present application are as follows: the method globally extracts an initial purple-fringed image by using chrominance information, calculates variance information of the target image and of the initial purple-fringed image, and searches for highlight high-contrast areas, thereby obtaining a purple-fringed pixel marking image that covers real purple fringes and the edges of blue-purple objects. Object edge color detection is performed by using the purple-fringed pixel marking image so as to correct the marking image; the corrected purple-fringed pixel marking image is used to correct the initial purple-fringed image to obtain an accurate corrected purple-fringed image; and finally the corrected purple-fringed image is used to perform purple fringing correction on the target image to obtain a purple-fringing-corrected image. In this way, the problems of missed detection and misjudgment of purple fringing are solved, the purple fringing is removed cleanly with a natural transition, and the visual effect of the image is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required for the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort, wherein:
FIG. 1 is a flowchart illustrating an embodiment of an image purple boundary elimination method according to the present application;
FIG. 2 is a schematic diagram illustrating a direction of calculating a difference value in an embodiment of an image purple fringing elimination method according to the present application;
FIG. 3 is a diagram illustrating a method for searching for unmarked pixel locations in an embodiment of an image purple fringing reduction method according to the present application;
FIG. 4 is a schematic diagram of a framework of an embodiment of an image purple fringing removing apparatus of the present application;
FIG. 5 is a block diagram of an embodiment of an electronic device of the present application;
FIG. 6 is a block diagram of an embodiment of a computer-readable storage medium of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1 to 3, fig. 1 is a schematic flow chart of an embodiment of an image purple fringing elimination method according to the present application; FIG. 2 is a schematic diagram illustrating a direction of calculating a difference value in an embodiment of an image purple fringing elimination method according to the present application; fig. 3 is a schematic diagram of a method for searching a non-labeled pixel position in an embodiment of an image purple fringing elimination method according to the present application.
An embodiment of the present application provides an image purple boundary elimination method, including the following steps:
s101: and acquiring a target image, wherein the target image is a color image containing purple fringing.
And acquiring a target image, wherein the target image is a color image containing purple fringing.
S102: and extracting the chrominance information of the target image, and extracting the initial purple-fringed image by using the chrominance information.
Extracting the chrominance information of the target image, namely acquiring the chrominance information of each pixel point of the target image, wherein the following method can be adopted:
in one embodiment, the red, green, and blue intensity values R, G, and B of each pixel of the target image may be obtained. And obtaining first chromaticity information U, wherein the first chromaticity information U is a difference value R-G between the red intensity value R and the green intensity value G, and if the first chromaticity information U is a negative number, marking the first chromaticity information as 0. And obtaining second chromaticity information V, wherein the second chromaticity information V is the difference B-G between the blue intensity value and the green intensity value, and if the second chromaticity information V is a negative number, recording the second chromaticity information as 0.
In other embodiments, the first chrominance information U may also be a ratio R/G of the red intensity value R to the green intensity value G, and similarly, if the first chrominance information is negative, the first chrominance information is marked as 0. The second chromaticity information V is a ratio B/G of a blue intensity value to a green intensity value, and similarly, if the second chromaticity information V is negative, the second chromaticity information is written as 0.
In other embodiments, the target image may be converted into a luminance and chrominance separation space YUV, and a red intensity value R, a blue intensity value B, and a luminance signal Y of each pixel point of the target image are obtained. And obtaining first chrominance information U, wherein the first chrominance information U is a difference value R-Y between the red intensity value R and the brightness signal Y, and if the first chrominance information U is a negative number, recording the first chrominance information U as 0. The second chrominance information V is the difference B-Y between the blue intensity value B and the luminance signal Y, and is recorded as 0 if the second chrominance information V is negative.
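The chrominance extraction described above can be summarized in a short sketch. The following Python snippet (using NumPy; the function and variable names are illustrative, not from the patent) implements the RGB-difference variant, i.e. U = R − G and V = B − G with negative values clamped to 0; the ratio and YUV variants would only change the two lines that compute u and v.

```python
import numpy as np

def extract_chrominance(rgb):
    """RGB-difference variant: U = R - G, V = B - G, negatives clamped to 0.
    `rgb` is assumed to be an H x W x 3 array in R, G, B channel order."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    u = np.clip(r - g, 0.0, None)  # first chrominance information U
    v = np.clip(b - g, 0.0, None)  # second chrominance information V
    return u, v
```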
After obtaining the chrominance information of the target image, the method also needs to extract an initial purple-fringed image by using the chrominance information, and specifically comprises the following steps:
and calculating a red channel pixel value r _ diff of each pixel point, wherein the red channel pixel value r _ diff is the smaller value of the first chrominance information U and the first product, and the first product is the product of the second chrominance information V and a preset maximum proportion max _ rb. Namely R _ diff = min (U, V × max _ rb), the red channel overflow value of the pixel point is used as the red channel pixel value, wherein the preset maximum ratio is the set maximum R/B ratio, and the value is determined by a large number of experiments and can also be adjusted according to the actual situation, which is not limited herein.
The blue channel pixel value b_diff of each pixel point is calculated as the smaller of the second chrominance information V and a first ratio, where the first ratio is the ratio of the first chrominance information U to a preset minimum proportion min_rb, namely b_diff = min(V, U/min_rb); this value is used as the blue channel pixel value of the pixel point. The preset minimum proportion is the minimum allowed R/B ratio; its value is determined through extensive experiments and can be adjusted according to the actual situation, which is not limited here.
And calculating the green channel pixel value of each pixel point, wherein the green channel pixel value is set to be 0.
Each pixel point in the initial purple fringing image comprises a red channel pixel value, a blue channel pixel value and a green channel pixel value.
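A minimal sketch of this step is given below, again with illustrative names; max_rb and min_rb stand for the preset maximum and minimum proportions, whose actual values the text says are chosen experimentally, so the defaults here are placeholders only.

```python
import numpy as np

def initial_purple_image(u, v, max_rb=2.0, min_rb=0.5):
    """Build the initial purple-fringed image from the chrominance planes:
    r = min(U, V * max_rb), b = min(V, U / min_rb), g = 0.
    The default proportions are placeholders, not values from the patent."""
    r_diff = np.minimum(u, v * max_rb)   # red channel pixel value
    b_diff = np.minimum(v, u / min_rb)   # blue channel pixel value
    g_diff = np.zeros_like(u)            # green channel is set to 0
    return np.stack([r_diff, g_diff, b_diff], axis=-1)
```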
S103: and obtaining a plurality of binary images through threshold segmentation by using the information of the target image and the initial purple fringed image, and combining the plurality of binary images to obtain a purple fringed pixel mark image.
And obtaining a plurality of binary images by threshold segmentation by using the information of the target image and the initial purple fringe image, wherein pixel points meeting the threshold are marked as 1 and serve as marking pixels, and pixel points not meeting the threshold are marked as 0 and serve as non-marking pixels. And combining the plurality of binary images to obtain a purple border pixel mark image.
Specifically, the target image is converted into a gray-scale image, the variance of each pixel point in the gray-scale image is calculated and compared with a first threshold; pixel points greater than or equal to the first threshold are recorded as 1, and pixel points smaller than the first threshold are recorded as 0, giving a first binary image. Pixel points whose binary result is 1 are marked pixels and are possibly purple fringing. Before the variance of each pixel point in the gray-scale image is computed, the gray-scale image can be low-pass filtered to smooth the image and filter out noise.
And adding the pixel values of all channels in the initial purple fringing image to convert the initial purple fringing image into a single-channel image, calculating the variance of each pixel point in the single-channel image, comparing the variance with a second threshold value, recording the pixel points which are greater than or equal to the second threshold value as 1, recording the pixel points which are smaller than the second threshold value as 0, and obtaining a second binary image, wherein the pixel points with the binary result of 1 are marking pixels and possibly purple fringing.
Calculating histogram information of the gray-scale image, comparing the histogram information with a third threshold, taking pixel points with intensity values larger than or equal to the third threshold as a highlight area, performing morphological expansion on the highlight area to obtain an expanded area, and subtracting the highlight area from the expanded area to obtain a high-contrast area, wherein the high-contrast area is a third binary image. The high contrast area may be purple fringing, the pixel points of the high contrast area are marked as 1, and the other pixel points are 0.
Merging the first binary image, the second binary image and the third binary image yields a purple-fringed pixel marking image that covers real purple fringes and the edges of blue-purple objects, so missed detections are unlikely to occur. In this embodiment, three binary images are merged to obtain the purple-fringed pixel marking image; in other embodiments, two binary images may be merged, or binary images of suspected purple fringing may be obtained and merged by other methods, so as to avoid missed detection.
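A possible implementation of step S103 is sketched below. It assumes that "merging" means a pixel-wise logical OR of the binary images, computes the local (windowed) variance with SciPy's uniform filter, and treats all thresholds, the window size and the dilation radius as illustrative parameters rather than values taken from the patent.

```python
import numpy as np
from scipy import ndimage

def purple_marker_image(gray, purple_rgb, t1, t2, t3, win=5, dilate_iter=3):
    """Build the purple-fringed pixel marking image from three binary images:
    local variance of the gray image, local variance of the single-channel
    purple image, and a dilated-highlight "high contrast" ring."""
    def local_var(img):
        img = img.astype(np.float64)
        m = ndimage.uniform_filter(img, win)
        m2 = ndimage.uniform_filter(img * img, win)
        return m2 - m * m

    bin1 = local_var(gray) >= t1                     # first binary image
    single = purple_rgb.sum(axis=-1)                 # add all channels together
    bin2 = local_var(single) >= t2                   # second binary image
    highlight = gray >= t3                           # highlight area
    dilated = ndimage.binary_dilation(highlight, iterations=dilate_iter)
    bin3 = dilated & ~highlight                      # high-contrast ring
    return (bin1 | bin2 | bin3).astype(np.uint8)     # merged marking image
```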
S104: and detecting the edge color of the purple fringing pixel marking image, and removing the object color in the purple fringing pixel marking image according to the detection result to obtain a corrected purple fringing pixel marking image.
In an embodiment, performing object edge color detection on the purple-fringed pixel marking image and removing object colors from it according to the detection result to obtain a corrected purple-fringed pixel marking image includes the following. Referring to fig. 2, with each marked pixel point X in the purple-fringed pixel marking image as a center, difference values in the horizontal direction, the vertical direction, the main diagonal direction and the auxiliary diagonal direction are calculated within a first preset size window; the main diagonal direction and the auxiliary diagonal direction are perpendicular to each other. In this embodiment, the first preset size window is a 5 × 5 window; in other embodiments it may also be a 3 × 3 window, a 7 × 7 window, or the like. The main diagonal direction is typically the top-left to bottom-right diagonal, and the auxiliary diagonal is the bottom-left to top-right diagonal. The difference value is calculated as shown in fig. 2: it is the absolute value of the sum of the pixels on the vertical line minus the sum of the pixels on the horizontal line, where the upper left corner of fig. 2 corresponds to the horizontal direction, the upper right corner to the vertical direction, the lower left corner to the main diagonal direction, and the lower right corner to the auxiliary diagonal direction.
Referring to fig. 3, the color-filled pixels in fig. 3 are purple-fringed marked pixels and the blank pixels are non-marked pixels. The direction with the largest difference value is selected as the maximum difference direction, and the first non-marked pixel X1 and the second non-marked pixel X2 closest to the marked pixel point X are searched for in the maximum difference direction.
The first non-marked pixel X1 and the second non-marked pixel X2 are located in the purple-fringed pixel marking image. On the initial purple-fringed image, the summation average of all non-marked pixels within a second preset size window centered on the pixel corresponding to the first non-marked pixel X1, and the summation average within a window centered on the pixel corresponding to the second non-marked pixel X2, are calculated. In this embodiment, the second preset size window is a 3 × 3 window; in other embodiments, it may also be a 5 × 5 window, a 7 × 7 window, or the like.
The difference between the summation average and the corresponding marked pixel point is calculated, and its absolute value is taken as the object edge detection value of the marked pixel point.
The object edge detection value is compared with a fourth threshold; when the object edge detection value is smaller than the fourth threshold, the color of the marked pixel point is regarded as an object color. The object colors are then removed from the purple-fringed pixel marking image according to the detection result: the marks of marked pixel points whose colors are object colors are set to 0, yielding a corrected purple-fringed pixel marking image.
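The following sketch illustrates the flow of step S104. Because the exact line sums of fig. 2 cannot be reproduced from the text alone, the per-direction difference below uses a simple surrogate (the absolute difference between the mean pixel values ahead of and behind the center along each direction); the search range, the averaging window and all names are illustrative assumptions.

```python
import numpy as np

# (dy, dx) steps for horizontal, vertical, main diagonal, auxiliary diagonal.
DIRS = [(0, 1), (1, 0), (1, 1), (1, -1)]

def remove_object_colors(marker, purple_rgb, t4, search=5, half=1):
    """For each marked pixel: pick the direction of largest local difference,
    find the nearest non-marked pixel on each side, average non-marked pixels
    around them on the (single-channel) purple image, and unmark the pixel
    if the absolute difference to it is below the fourth threshold t4."""
    h, w = marker.shape
    single = purple_rgb.sum(axis=-1)
    corrected = marker.copy()
    for y, x in zip(*np.nonzero(marker)):
        diffs = []
        for dy, dx in DIRS:                      # surrogate directional difference
            ahead, behind = [], []
            for k in (1, 2):                     # stay inside a 5x5 neighbourhood
                if 0 <= y + k * dy < h and 0 <= x + k * dx < w:
                    ahead.append(single[y + k * dy, x + k * dx])
                if 0 <= y - k * dy < h and 0 <= x - k * dx < w:
                    behind.append(single[y - k * dy, x - k * dx])
            diffs.append(abs(np.mean(ahead or [0.0]) - np.mean(behind or [0.0])))
        dy, dx = DIRS[int(np.argmax(diffs))]     # maximum difference direction
        vals = []
        for s in (1, -1):                        # nearest non-marked pixel each way
            for k in range(1, search + 1):
                yy, xx = y + s * k * dy, x + s * k * dx
                if 0 <= yy < h and 0 <= xx < w and marker[yy, xx] == 0:
                    y0, y1 = max(0, yy - half), min(h, yy + half + 1)
                    x0, x1 = max(0, xx - half), min(w, xx + half + 1)
                    patch = single[y0:y1, x0:x1]
                    unmarked = marker[y0:y1, x0:x1] == 0
                    vals.append(patch[unmarked].mean())
                    break
        if vals and abs(np.mean(vals) - single[y, x]) < t4:
            corrected[y, x] = 0                  # object color: remove the mark
    return corrected
```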
In the present embodiment, the direction determination is performed on the purple-fringed pixel marking image and the color difference determination is performed on the initial purple-fringed image. However, the determinations are not limited to these images; for example, the color difference determination may also be performed on the target image, which is not limited herein.
The purple-fringed pixel labeled image is utilized to detect the color of the edge of the object, and the purple-fringed pixel labeled image is corrected according to the detection result, so that the anti-interference capability on noise can be enhanced, and the problem of misjudgment of the edge of the blue-purple object is greatly reduced.
S105: and multiplying the corrected purple fringing pixel marking image and the initial purple fringing image to obtain a corrected purple fringing image.
The corrected purple-fringed pixel marking image is multiplied by the initial purple-fringed image, that is, the pixel values of each pixel point are multiplied, so that only the pixel points whose marker value is 1 retain their purple-fringe values, finally yielding the corrected purple-fringed image.
Before the corrected purple-fringed pixel marking image is multiplied by the initial purple-fringed image, the corrected purple-fringed pixel marking image can be smoothed, for example by low-pass filtering, so that it becomes smooth and the noise in it is filtered out; this refines the correction of the initial purple-fringed image and makes the transition between purple-fringed pixels and surrounding non-purple-fringed pixels more natural.
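A brief sketch of S105, under the assumption that the low-pass filter is a Gaussian blur (sigma and the function names are illustrative):

```python
import numpy as np
from scipy import ndimage

def corrected_purple_image(corrected_marker, initial_purple, sigma=1.0):
    """Smooth the corrected marking image, then multiply it pixel-wise with
    the initial purple-fringed image; the soft mask makes the transition
    between purple-fringed and non-purple-fringed pixels more gradual."""
    soft = ndimage.gaussian_filter(corrected_marker.astype(np.float32), sigma)
    return initial_purple * soft[..., None]   # broadcast mask over the channels
```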
S106: and carrying out purple fringing correction processing on the target image by utilizing the corrected purple fringing image to obtain a purple fringing corrected image.
In one embodiment, the purple fringing correction processing of the target image by using the corrected purple fringing image includes: and subtracting the corrected purple fringing image from the target image to obtain a purple fringing corrected image after the purple fringing is eliminated.
In other embodiments, performing the purple fringing correction processing on the target image by using the modified purple fringing image may further include: and converting the image into a brightness and chrominance separation space YUV, keeping the brightness of the target image unchanged according to the corrected purple fringing image, and modifying the chrominance of the target image to obtain a purple fringing corrected image after the purple fringing is eliminated.
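The subtraction variant of S106 can be expressed as a per-channel operation; the sketch below assumes an 8-bit RGB target image and the corrected purple-fringed image produced above (its green channel is 0, so only the red and blue channels are reduced). The function name is illustrative.

```python
import numpy as np

def correct_purple_fringe(target_rgb, corrected_purple):
    """Subtract the corrected purple-fringed image from the target image
    and clip back to the valid 8-bit range."""
    out = target_rgb.astype(np.float32) - corrected_purple
    return np.clip(out, 0, 255).astype(np.uint8)
```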
When the corrected purple-fringed image is used to perform purple fringing correction on the target image, pixels with severe purple fringing become gray, while for pixels with mild purple fringing the object color is retained to a certain degree in the correction result rather than being simply desaturated, so the visual effect is more natural.
In this way, the method globally extracts an initial purple-fringed image by using chrominance information, calculates variance information of the target image and of the initial purple-fringed image, and searches for highlight high-contrast areas, thereby obtaining a purple-fringed pixel marking image that covers real purple fringes and the edges of blue-purple objects. Object edge color detection is performed by using the purple-fringed pixel marking image so as to correct the marking image; the corrected purple-fringed pixel marking image is used to correct the initial purple-fringed image to obtain an accurate corrected purple-fringed image; and finally the corrected purple-fringed image is used to perform purple fringing correction on the target image to obtain a purple-fringing-corrected image. This solves the problems of missed detection and misjudgment of purple fringing, removes the purple fringing cleanly with a natural transition, and improves the visual effect of the image.
Referring to fig. 4, fig. 4 is a schematic frame diagram of an embodiment of an image purple fringing elimination apparatus according to the present application.
The present application further provides an image purple fringing elimination apparatus 40, which includes an obtaining module 41 and a processing module 42, so as to implement the image purple fringing elimination method of the corresponding embodiment. Specifically, the obtaining module 41 acquires a target image, which is a color image containing purple fringing. The processing module 42 extracts the chrominance information of the target image and extracts an initial purple-fringed image by using the chrominance information; the processing module 42 obtains a plurality of binary images through threshold segmentation by using information of the target image and the initial purple-fringed image, and merges the plurality of binary images to obtain a purple-fringed pixel marking image; the processing module 42 performs object edge color detection on the purple-fringed pixel marking image and removes object colors from it according to the detection result to obtain a corrected purple-fringed pixel marking image; the processing module 42 multiplies the corrected purple-fringed pixel marking image with the initial purple-fringed image to obtain a corrected purple-fringed image; and the processing module 42 performs purple fringing correction on the target image by using the corrected purple-fringed image to obtain a purple-fringing-corrected image. The apparatus globally extracts an initial purple-fringed image by using chrominance information, calculates variance information of the target image and of the initial purple-fringed image, and searches for highlight high-contrast areas to obtain a purple-fringed pixel marking image that covers real purple fringes and the edges of blue-purple objects. Object edge color detection is performed by using the purple-fringed pixel marking image so as to correct the marking image; the corrected purple-fringed pixel marking image is used to correct the initial purple-fringed image to obtain an accurate corrected purple-fringed image; and finally the corrected purple-fringed image is used to perform purple fringing correction on the target image to obtain a purple-fringing-corrected image. This solves the problems of missed detection and misjudgment of purple fringing, removes the purple fringing cleanly with a natural transition, and improves the visual effect of the image.
Referring to fig. 5, another embodiment of the present application provides an electronic device 50, which includes a memory 51 and a processor 52 coupled to each other, where the processor 52 is configured to execute program instructions stored in the memory 51 to implement the image purple fringing elimination method of any of the above embodiments. In a specific implementation scenario, the electronic device 50 may include, but is not limited to, a microcomputer or a server; the electronic device 50 may also be a mobile device such as a notebook computer or a tablet computer, which is not limited herein.
In particular, the processor 52 is configured to control itself and the memory 51 to implement the image purple fringing elimination method of any of the above embodiments. The processor 52 may also be referred to as a CPU (Central Processing Unit). The processor 52 may be an integrated circuit chip having signal processing capabilities. The processor 52 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. In addition, the processor 52 may be implemented jointly by integrated circuit chips.
Referring to fig. 6, another embodiment of the present application provides a computer-readable storage medium 60, on which program data 61 is stored, and when the program data 61 is executed by a processor, the method for eliminating purple fringing in an image according to any of the embodiments is implemented.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely one type of logical division, and an actual implementation may have another division, for example, a unit or a component may be combined or integrated with another system, or some features may be omitted, or not implemented. In addition, the shown or discussed coupling or direct coupling or communication connection between each other may be through some interfaces, indirect coupling or communication connection between devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on network elements. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium 60. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium 60 and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method of the embodiments of the present application. And the aforementioned storage medium 60 includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only an example of the present application, and is not intended to limit the scope of the present application, and all equivalent structures or equivalent processes performed by the present application and the contents of the attached drawings, which are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (9)

1. An image purple fringing elimination method, characterized in that the method comprises:
acquiring a target image, wherein the target image is a color image containing purple fringing;
extracting the chrominance information of the target image, and extracting an initial purple-fringed image by using the chrominance information;
obtaining a plurality of binary images through threshold segmentation by utilizing the information of the target image and the initial purple fringed image, and combining the plurality of binary images to obtain a purple fringed pixel mark image; the obtaining a plurality of binary images through threshold segmentation by using the information of the target image and the initial purple fringed image, and combining the plurality of binary images to obtain a purple fringed pixel mark image includes:
converting the target image into a gray-scale image, calculating the variance of each pixel point in the gray-scale image, comparing the variance with a first threshold value, recording the pixel points which are greater than or equal to the first threshold value as 1, and recording the pixel points which are smaller than the first threshold value as 0 to obtain a first binary image;
adding pixel values of all channels in the initial purple fringed image to convert the pixel values into a single-channel image, calculating the variance of each pixel point in the single-channel image, comparing the variance with a second threshold value, recording the pixel points which are greater than or equal to the second threshold value as 1, and recording the pixel points which are smaller than the second threshold value as 0 to obtain a second binary image;
calculating histogram information of the gray scale image, comparing the histogram information with a third threshold, taking pixel points with intensity values larger than or equal to the third threshold as a highlight area, performing morphological expansion on the highlight area to obtain an expanded area, and subtracting the highlight area from the expanded area to obtain a high-contrast area, wherein the high-contrast area is a third binary image;
merging the first binary image, the second binary image and the third binary image to obtain a purple-edge pixel marker image;
carrying out object edge color detection on the purple fringed pixel label image, and removing the object color in the purple fringed pixel label image according to the detection result to obtain a corrected purple fringed pixel label image;
multiplying the corrected purple fringing pixel marking image with the initial purple fringing image to obtain a corrected purple fringing image;
and carrying out purple fringing correction processing on the target image by utilizing the corrected purple fringing image to obtain a purple fringing corrected image.
2. The method of claim 1, wherein the extracting chrominance information of the target image comprises:
acquiring a red intensity value, a green intensity value and a blue intensity value of each pixel point of the target image;
acquiring first chroma information, wherein the first chroma information is a difference value or a ratio of the red intensity value and the green intensity value, and if the first chroma information is a negative number, marking the first chroma information as 0;
and acquiring second chromaticity information, wherein the second chromaticity information is a difference value or a ratio of the blue intensity value and the green intensity value, and if the second chromaticity information is a negative number, recording the second chromaticity information as 0.
3. The method of claim 1, wherein the extracting chrominance information of the target image comprises:
converting the target image into a brightness and chrominance separation space, and obtaining a red intensity value, a blue intensity value and a luminance signal of each pixel point of the target image;
acquiring first chroma information, wherein the first chroma information is a difference value between the red intensity value and the brightness signal, and if the first chroma information is a negative number, the first chroma information is marked as 0;
and acquiring second chrominance information, wherein the second chrominance information is the difference value between the blue intensity value and the brightness signal, and if the second chrominance information is a negative number, recording the second chrominance information as 0.
4. The method as claimed in claim 2 or 3, wherein said extracting an initial purple-fringed image using said chrominance information comprises:
calculating a red channel pixel value of each pixel point, wherein the red channel pixel value is the smaller value of the first chrominance information and a first product, and the first product is the product of the second chrominance information and a preset maximum proportion;
calculating a blue channel pixel value of each pixel point, wherein the blue channel pixel value is the smaller value of the second chrominance information and a first ratio, and the first ratio is the ratio of the first chrominance information and a preset minimum proportion;
and calculating a green channel pixel value of each pixel point, wherein the green channel pixel value is set to be 0.
5. The method of claim 1, wherein the performing object edge color detection on the purple fringed pixel label image comprises:
calculating difference values in a horizontal direction, a vertical direction, a main diagonal direction and an auxiliary diagonal direction in a first preset size window by taking each marked pixel point in the purple border pixel marked image as a center, wherein the main diagonal direction is perpendicular to the auxiliary diagonal direction;
selecting the direction with the largest difference value as the maximum difference direction, and selecting, in the maximum difference direction, the first non-marked pixel and the second non-marked pixel which are closest to the marked pixel point;
calculating a summation average value of all non-mark pixels in a second preset size window respectively centering on a first non-mark pixel and a second non-mark pixel on the initial purple fringing image;
calculating the absolute value of the difference value between the summation average value and the corresponding marking pixel point to obtain an object edge detection value of the marking pixel point;
and comparing the object edge detection value with a fourth threshold value, and taking the color of the marking pixel point as the object color in response to the object edge detection value being smaller than the fourth threshold value.
6. The method of claim 1, wherein before said multiplying said modified purple fringed pixel label image with said initial purple fringed image, the method comprises:
and carrying out smoothing treatment on the corrected purple fringing pixel marking image.
7. The method according to claim 1, wherein performing purple fringing correction processing on the target image by using the modified purple fringed image to obtain a purple fringing corrected image comprises:
subtracting the corrected purple fringing image from the target image to obtain a purple fringing eliminated corrected image; or,
and converting the image into a bright-chroma separation space, keeping the brightness of the target image unchanged according to the corrected purple fringing image, and correcting the chroma of the target image to obtain the purple fringing corrected image without the purple fringing.
8. An electronic device comprising a memory and a processor coupled to each other, the processor being configured to execute program instructions stored in the memory to implement the method of any of claims 1 to 7.
9. A computer-readable storage medium, on which program data are stored, which program data, when being executed by a processor, carry out the method of any one of claims 1 to 7.
CN202110036152.7A 2021-01-12 2021-01-12 Image purple border elimination method, equipment and storage medium Active CN112887693B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110036152.7A CN112887693B (en) 2021-01-12 2021-01-12 Image purple border elimination method, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110036152.7A CN112887693B (en) 2021-01-12 2021-01-12 Image purple border elimination method, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112887693A CN112887693A (en) 2021-06-01
CN112887693B (en) 2023-04-18

Family

ID=76044240

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110036152.7A Active CN112887693B (en) 2021-01-12 2021-01-12 Image purple border elimination method, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112887693B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113379778A (en) * 2021-06-04 2021-09-10 大连海事大学 Image purple boundary detection method based on content self-adaptive threshold
CN113393540B (en) * 2021-06-10 2023-10-27 爱芯元智半导体(宁波)有限公司 Method and device for determining color edge pixel points in image and computer equipment
CN113784101A (en) * 2021-09-26 2021-12-10 三星半导体(中国)研究开发有限公司 Purple fringing correction method and purple fringing correction device
US11863916B2 (en) * 2022-01-27 2024-01-02 Altek Semiconductor Corporation Color correction method and image correction apparatus
CN114511461A (en) * 2022-02-10 2022-05-17 上海闻泰信息技术有限公司 Image processing method, electronic device, and computer-readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111047619A (en) * 2018-10-11 2020-04-21 展讯通信(上海)有限公司 Face image processing method and device and readable storage medium
CN111199524A (en) * 2019-12-26 2020-05-26 浙江大学 Purple edge correction method for image of adjustable aperture optical system
CN111340734A (en) * 2020-03-02 2020-06-26 浙江大学 Image purple boundary correction method using convolutional neural network model
CN111353960A (en) * 2020-03-02 2020-06-30 浙江大学 Image purple boundary correction method based on region growing and cross channel information

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4539299B2 (en) * 2004-11-08 2010-09-08 ソニー株式会社 Image processing apparatus, image processing method, and computer program
KR101160956B1 (en) * 2009-11-30 2012-06-29 서강대학교산학협력단 Method and system for correcting purple fringing
CN106251298B (en) * 2016-07-22 2020-03-31 华为技术有限公司 Method and apparatus for processing image
CN107864365B (en) * 2017-10-31 2020-03-31 上海集成电路研发中心有限公司 Method for eliminating purple border of image
CN108961295B (en) * 2018-07-27 2022-01-28 重庆师范大学 Purple soil image segmentation and extraction method based on normal distribution H threshold

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111047619A (en) * 2018-10-11 2020-04-21 展讯通信(上海)有限公司 Face image processing method and device and readable storage medium
CN111199524A (en) * 2019-12-26 2020-05-26 浙江大学 Purple edge correction method for image of adjustable aperture optical system
CN111340734A (en) * 2020-03-02 2020-06-26 浙江大学 Image purple boundary correction method using convolutional neural network model
CN111353960A (en) * 2020-03-02 2020-06-30 浙江大学 Image purple boundary correction method based on region growing and cross channel information

Also Published As

Publication number Publication date
CN112887693A (en) 2021-06-01

Similar Documents

Publication Publication Date Title
CN112887693B (en) Image purple border elimination method, equipment and storage medium
US7916937B2 (en) Image processing device having color shift correcting function, image processing program and electronic camera
JP4054184B2 (en) Defective pixel correction device
US8509533B2 (en) Image processing device and image processing method
US8405780B1 (en) Generating a clean reference image
US8582878B1 (en) Purple fringing automatic detection and correction
US20060251322A1 (en) Color fringe desaturation for electronic imagers
IES20050822A2 (en) Foreground/background segmentation in digital images with differential exposure calculations
JP2012176641A (en) Detection apparatus for parking frame
EP3100449A1 (en) Method for conversion of a saturated image into a non-saturated image
CN111161188B (en) Method for reducing image color noise, computer device and readable storage medium
WO2019076326A1 (en) Shadow detection method and system for surveillance video image, and shadow removing method
JP5640622B2 (en) Method for classifying red-eye object candidates, computer-readable medium, and image processing apparatus
US8482630B2 (en) Apparatus and method for adjusting automatic white balance by detecting effective area
US9860456B1 (en) Bayer-clear image fusion for dual camera
JP2003123063A (en) Image processor
SE1550394A1 (en) Segmentation based image transform
CN111414877B (en) Table cutting method for removing color frame, image processing apparatus and storage medium
JP2003224859A (en) Image processing apparatus, image processing method, and program
CN114511461A (en) Image processing method, electronic device, and computer-readable storage medium
CN112184588A (en) Image enhancement system and method for fault detection
CN108133204B (en) Hand body identification method, device, equipment and computer readable storage medium
CN108810320B (en) Image quality improving method and device
CN114581344B (en) Purple edge correction method for video image
KR102315200B1 (en) Image processing apparatus for auto white balance and processing method therefor

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant