CN115661179A - Image purple edge detection method and device - Google Patents


Info

Publication number
CN115661179A
CN115661179A
Authority
CN
China
Prior art keywords: image, purple, determining, region, fringing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211189509.6A
Other languages
Chinese (zh)
Inventor
范玉龙
谭朗标
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spreadtrum Communications Shenzhen Co ltd
Original Assignee
Spreadtrum Communications Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spreadtrum Communications Shenzhen Co ltd filed Critical Spreadtrum Communications Shenzhen Co ltd
Priority to CN202211189509.6A priority Critical patent/CN115661179A/en
Publication of CN115661179A publication Critical patent/CN115661179A/en
Pending legal-status Critical Current

Abstract

The invention relates to the field of image processing, and in particular to a method and a device for detecting purple fringing in an image. The method first acquires a first image; it then determines a first purple fringing candidate region of the first image according to the gray-level variation of the first image in a first color space, determines a second purple fringing candidate region according to the color range of the first image in a second color space, and finally determines the overlap of the two candidate regions as the purple fringing region of the first image. Because the two candidate regions are determined independently, from gray-level variation and from color range, and the final purple fringing region is determined from both, the accuracy and the efficiency of image purple fringing detection are improved.

Description

Image purple edge detection method and device
Technical Field
The invention relates to the field of image processing, and in particular to a method and a device for detecting purple fringing in images.
Background
Purple fringing is widespread in images produced by digital imaging systems such as mobile phone cameras, digital cameras, and surveillance cameras. When a user shoots with such a device under backlighting, with a large aperture, or in similar conditions, purple fringing is easily observed in local areas of the resulting image. Solving the purple fringing problem therefore allows imaging devices to produce images with a better visual effect.
At present, image purple fringing detection methods struggle to balance detection efficiency and detection precision: when the detection result is accurate, the detection range is narrow and efficiency is hard to guarantee; when the detection range is wide, the result is coarse and contains many false detections, so precision is hard to guarantee, which degrades subsequent purple fringing correction.
Disclosure of Invention
Embodiments of the invention provide a method and a device for detecting purple fringing in an image, which determine the regions of an image that exhibit purple fringing according to the pixel gray values of the image while improving both detection efficiency and detection precision.
In a first aspect, an embodiment of the present invention provides an image purple boundary detection method, including:
acquiring a first image;
determining a first purple fringing candidate area of the first image according to the gray level change of the first image in a first color space;
determining a second purple fringing candidate area of the first image according to the color range of the first image in a second color space;
determining a superposition area of the first purple fringing candidate area and the second purple fringing candidate area as a purple fringing area of the first image.
In one implementation, the determining a first purple fringing candidate region of the first image according to a gray level variation of the first image in a first color space includes:
dividing the first image into a plurality of image sub-regions;
determining a pixel gray scale difference value of each image subregion;
and determining the image sub-region with the pixel gray difference value larger than a threshold value as the first purple fringing candidate region.
In one implementation, the determining a pixel gray difference value of each of the image sub-regions includes:
acquiring the maximum gray value in each image subregion;
acquiring a minimum gray value in each image subregion;
and determining the difference value between the maximum gray value and the minimum gray value of each image sub-region as the pixel gray difference value corresponding to the image sub-region.
In one implementation, the determining, as the first purple boundary candidate region, an image sub-region in which the pixel gray difference value is greater than a threshold includes:
determining an adaptive threshold for each image sub-region;
for any image subregion, if the pixel gray level difference value of the image subregion is greater than the adaptive threshold value corresponding to the image subregion, the image subregion is determined as the first purple fringing candidate region.
In one implementation, the determining an adaptive threshold for each image sub-region includes:
for any image subregion, determining all image subregions contained in the first region range by taking the image subregion as a center as adjacent image subregions of the image subregion;
determining the weighted sum of the mean value and the standard deviation of the pixel gray level difference values of all the adjacent image subregions corresponding to any image subregion as the adaptive threshold value of the image subregion;
wherein the adaptive threshold is adjustable based on a weight of the mean or a weight of the standard deviation.
In one implementation, the determining a second purple boundary candidate region of the first image according to the color range of the first image in the second color space includes:
determining whether the color of each pixel point in the first image is in a preset color range according to the color value of each pixel point in the second color space;
and determining pixel points with colors within a preset color range as pixel points of the second purple fringing candidate area.
In one implementation, determining a coincidence region of the first purple boundary candidate region and the second purple boundary candidate region as a purple boundary region of the first image includes:
and for any pixel point in the first image, if the pixel point belongs to the first purple boundary candidate region and the second purple boundary candidate region at the same time, determining the pixel point as the pixel point of the purple boundary region.
In a second aspect, an embodiment of the present invention provides an image purple fringing detection apparatus, where the apparatus includes:
the acquisition module is used for acquiring a first image;
the determining module is used for determining a first purple fringing candidate area of the first image according to the gray level change of the first image in a first color space;
the determining module is further configured to determine a second purple fringing candidate region of the first image according to a color range of the first image in a second color space;
the determining module is further configured to determine a superposed region of the first purple fringing candidate region and the second purple fringing candidate region as a purple fringing region of the first image.
In a third aspect, an embodiment of the present invention provides an electronic device, including:
at least one processor; and
at least one memory communicatively coupled to the processor, wherein:
the memory stores program instructions executable by the processor, and the processor, when invoking the program instructions, is capable of performing the method provided in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, where the computer-readable storage medium includes a stored program, where the program, when executed, controls an apparatus in which the computer-readable storage medium is located to perform the method provided in the first aspect.
In the embodiment of the invention, a first image is acquired; a first purple fringing candidate region of the first image is determined according to the gray-level variation of the first image in a first color space; a second purple fringing candidate region is determined according to the color range of the first image in a second color space; and finally the overlap of the two candidate regions is determined as the purple fringing region of the first image. Because the two candidate regions are determined from the gray-level variation and the color range of the first image, and the final purple fringing region is determined from both, the accuracy and the efficiency of image purple fringing detection are improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
Fig. 1 is a flowchart of an image purple boundary detection method according to an embodiment of the present invention;
fig. 2 is a flowchart of another image purple boundary detection method according to an embodiment of the present invention;
fig. 3 is a flowchart of another image purple boundary detection method according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an image purple fringe detection apparatus according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to better understand the technical solutions of the present specification, the following detailed description is made with reference to the accompanying drawings.
It should be understood that the described embodiments are only a few embodiments of the present specification, and not all embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments in the present specification without any inventive step are within the scope of the present specification.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the specification. As used in the description of the invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Fig. 1 is a flowchart of an image purple boundary detection method according to an embodiment of the present invention. The method can be applied to terminal devices such as smart phones and tablet computers, and as shown in fig. 1, the method can include:
step 101, a first image is acquired.
The first image may be an image acquired by the terminal device at the time of shooting, or a picture stored in advance.
Step 102, determining a first purple fringing candidate area of the first image according to the gray level change of the first image in the first color space.
In embodiments of the invention, the first color space includes the RGB color space; brightness variation in the first image can be represented by gray-level variation in the RGB color space, and the first purple fringing candidate region includes the bright-dark boundary regions of the first image. Purple fringing usually appears at the bright-dark boundaries of an image, so after the terminal device captures an image, it can determine the bright-dark boundary region (the first purple fringing candidate region) from the gray-level variation of the image in the RGB color space, and then further determine the purple fringing region from it.
In some embodiments, the terminal device may determine the first purple fringing candidate region of the first image through the following steps: divide the first image into a plurality of image sub-regions; determine the pixel gray difference value of each image sub-region; and determine the image sub-regions whose pixel gray difference value is greater than a threshold as the first purple fringing candidate region. Each image sub-region may contain multiple pixels, and each pixel has a gray value in each single color channel of the RGB color space. For example, a pixel's gray values might be R (red) 80, G (green) 90, B (blue) 130; different gray values represent different brightness levels of the corresponding color channel. The terminal device measures gray-level variation by the pixel gray difference value of each image sub-region: when the difference value of a sub-region exceeds the threshold, that sub-region can be regarded as part of the bright-dark boundary region of the first image, i.e., the first purple fringing candidate region.
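The division into sub-regions described above might be sketched as follows. This is an illustrative NumPy implementation, not part of the patent; the block size `block` and the crop-at-the-edges behavior are assumptions (a real implementation might pad instead):

```python
import numpy as np

def split_into_subregions(image, block):
    """Split an H x W x 3 image into non-overlapping block x block sub-regions.

    Edge rows/columns that do not divide evenly are cropped.
    Returns an array of shape (H//block, W//block, block, block, 3).
    """
    h, w = image.shape[:2]
    h2, w2 = h - h % block, w - w % block  # largest evenly divisible extent
    cropped = image[:h2, :w2]
    return (cropped
            .reshape(h2 // block, block, w2 // block, block, 3)
            .swapaxes(1, 2))  # bring block-row/block-col axes together

img = np.zeros((7, 9, 3), dtype=np.uint8)
blocks = split_into_subregions(img, 3)
print(blocks.shape)  # (2, 3, 3, 3, 3)
```

Each `blocks[i, j]` is then one `block x block x 3` sub-region on which the gray difference can be computed.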
In some embodiments, the terminal device may determine the pixel gray difference value of an image sub-region as follows: first obtain the maximum gray value in the sub-region, then obtain the minimum gray value, and take their difference as the sub-region's pixel gray difference value. When doing so, the terminal device pools the single-channel gray values of all pixels in the sub-region and compares them together: the largest of all these values is the sub-region's maximum gray value, and the smallest is its minimum gray value. For example, suppose a sub-region contains pixels A, B and C, with gray values R10, G60, B140 for pixel A, R20, G30, B90 for pixel B, and R50, G55, B70 for pixel C. The sub-region then contains 9 gray values: 10, 60, 140, 20, 30, 90, 50, 55 and 70, of which 140 is the maximum and 10 is the minimum, so the pixel gray difference value of this sub-region is 140 - 10 = 130.
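The pooled max-minus-min computation can be reproduced with the A/B/C example above; this NumPy snippet is an illustrative sketch, not part of the patent:

```python
import numpy as np

# Sub-region with the three example pixels A, B, C;
# each row is one pixel's (R, G, B) gray values.
subregion = np.array([[10, 60, 140],
                      [20, 30, 90],
                      [50, 55, 70]], dtype=np.int32)

def gray_difference(subregion):
    # All single-channel values are pooled together, per the description:
    # the sub-region's difference is the global max minus the global min.
    return int(subregion.max() - subregion.min())

print(gray_difference(subregion))  # 130
```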
In some embodiments, after determining the pixel gray difference values of the image sub-regions, the terminal device determines an adaptive threshold for each image sub-region; if a sub-region's pixel gray difference value is greater than its adaptive threshold, that sub-region is determined as part of the first purple fringing candidate region. The adaptive threshold can be determined in more than one way; an embodiment of the invention provides the following method: for any image sub-region, take all image sub-regions contained in a first region range centered on it as its neighboring sub-regions, and set its adaptive threshold to the weighted sum of the mean and the standard deviation of the pixel gray difference values of those neighboring sub-regions. Optionally, the terminal device may set weights for the mean and/or the standard deviation and adjust the adaptive threshold by adjusting these weights, or set the weights to 1 to ignore their effect. In addition, when determining the adaptive threshold, suitable weights may also be assigned to the neighboring sub-regions themselves, and the threshold obtained by weighted processing.
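One possible reading of the neighborhood-based adaptive threshold is sketched below. The window radius and the weights `w_mean`/`w_std` are hypothetical tuning parameters; the description only specifies a weighted sum of the neighbors' mean and standard deviation:

```python
import numpy as np

def adaptive_thresholds(diff_map, radius=1, w_mean=1.0, w_std=1.0):
    """Per-sub-region adaptive threshold over a map of gray difference values.

    For each sub-region, the threshold is w_mean * mean + w_std * std of the
    differences in the surrounding (2*radius+1)^2 window (clipped at edges).
    """
    h, w = diff_map.shape
    out = np.empty((h, w), dtype=np.float64)
    for i in range(h):
        for j in range(w):
            window = diff_map[max(0, i - radius):i + radius + 1,
                              max(0, j - radius):j + radius + 1]
            out[i, j] = w_mean * window.mean() + w_std * window.std()
    return out

# Toy 2x2 map: only the sub-region with difference 50 exceeds its threshold.
diffs = np.array([[10., 10.], [10., 50.]])
th = adaptive_thresholds(diffs)
candidate_mask = diffs > th  # True where the sub-region is a candidate
print(candidate_mask.tolist())  # [[False, False], [False, True]]
```

With weights of 1 (the "ignore the weights" case mentioned above), every window here covers the whole map, giving a threshold of mean 20 plus std of about 17.3.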
Step 103, determining a second purple fringing candidate area of the first image according to the color range of the first image in the second color space.
In embodiments of the invention, the second color space includes the YCbCr color space, the color range describes the colors of the first image, and the second purple fringing candidate region includes the purple-colored areas of the first image. Purple fringing usually appears as a fixed color, typically purple, so after the terminal device captures an image, the purple areas (the second purple fringing candidate region) can be determined from the color values of the image in the YCbCr color space, and the purple fringing region can then be further determined from them.
In some embodiments, when determining the second purple fringing candidate region, the terminal device first converts the first image into the YCbCr color space, then determines the color value of each pixel in the YCbCr color space, and determines the pixels whose colors fall within a preset color range as pixels of the second purple fringing candidate region. In the YCbCr color space, Y represents the luminance of a color, while Cb and Cr represent its blue-difference and red-difference chroma components, respectively. Each pixel thus has three color values, a Y value, a Cb value and a Cr value; when all three values fall within the preset color range, the pixel can be determined as a pixel of the second purple fringing candidate region.
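As an illustration of this color-range test, the following sketch converts RGB to YCbCr with the full-range BT.601 formulas (one common convention) and applies an assumed purple range. The concrete bounds `y_range`, `cb_min` and `cr_min` are illustrative assumptions, since the description does not fix numeric thresholds:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Full-range BT.601 RGB -> YCbCr conversion (offset 128 on chroma)."""
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128.0
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128.0
    return np.stack([y, cb, cr], axis=-1)

def purple_mask(rgb, y_range=(30, 220), cb_min=140, cr_min=140):
    # Illustrative "purple" test: purple has both blue (Cb) and red (Cr)
    # chroma above the neutral value 128, at moderate luminance.
    ycbcr = rgb_to_ycbcr(rgb)
    y, cb, cr = ycbcr[..., 0], ycbcr[..., 1], ycbcr[..., 2]
    return (y >= y_range[0]) & (y <= y_range[1]) & (cb >= cb_min) & (cr >= cr_min)

pixel = np.array([[[160, 40, 200]]], dtype=np.uint8)  # purplish pixel
print(bool(purple_mask(pixel)[0, 0]))  # True
```

A neutral gray pixel maps to Cb = Cr = 128 and would fall outside this range.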
And 104, determining the overlapping area of the first purple fringing candidate area and the second purple fringing candidate area as the purple fringing area of the first image.
In some embodiments, if a pixel is determined to belong to both the first purple fringing candidate region and the second purple fringing candidate region, the terminal device determines that pixel as a pixel of the purple fringing region. The terminal device may first screen out the first purple fringing candidate region, and then determine the pixels within it that also belong to the second purple fringing candidate region as pixels of the purple fringing region.
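This intersection step reduces to a pixel-wise logical AND of the two candidate masks; a minimal NumPy sketch (illustrative, not part of the patent):

```python
import numpy as np

def purple_fringing_region(gray_candidate, color_candidate):
    # A pixel belongs to the purple fringing region only if it is in BOTH
    # the bright-dark boundary mask and the purple-color mask.
    return gray_candidate & color_candidate

m1 = np.array([[True, True], [False, True]])   # first candidate (gray change)
m2 = np.array([[True, False], [False, True]])  # second candidate (color range)
final = purple_fringing_region(m1, m2)
print(final.tolist())  # [[True, False], [False, True]]
```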
In embodiments of the invention, the bright-dark boundary region of the first image can be determined accurately from the pixel gray difference values in the RGB color space, while the region of a specific color (purple) can be determined efficiently from the color values in the YCbCr color space; the region that satisfies both conditions is determined as the purple fringing region. This improves both the accuracy and the efficiency of image purple fringing detection.
Fig. 2 is a flowchart of another image purple boundary detection method according to an embodiment of the present invention. As shown in fig. 2, may include:
step 201, a first image is acquired.
The first image acquired by the terminal device is in the RGB color space by default, as shown at 301 in fig. 3. Before the first image is acquired, the terminal device may further determine specific parameters, such as the size of the image sub-region, the adaptive threshold weighting coefficient, and the like, according to the selection of the user.
Step 202, dividing the image sub-regions.
Step 203, the maximum pixel gray scale value is determined.
The terminal device determines the maximum pixel gray values contained in each image sub-region, and a gray map composed of the maximum pixel gray values of the respective image sub-regions may be as shown at 302 in fig. 3.
Step 204, the minimum pixel gray scale value is determined.
The terminal device determines the minimum pixel gray value contained in each image sub-region, and a gray map composed of the minimum pixel gray values of the respective image sub-regions may be as shown in 303 in fig. 3.
In step 205, a pixel gray level difference value is determined.
The terminal device may use the pixel gray scale difference values of each image sub-region to form a gray scale map, which is shown as 304 in fig. 3.
And step 206, performing adaptive threshold value binarization processing.
The terminal device obtains the adaptive threshold of each image sub-region and binarizes the pixel gray difference map against these thresholds. The processed image may be as shown at 305 in fig. 3, where white marks the first purple fringing candidate region of the first image.
Step 207, convert to YCbCr space.
And step 208, determining the color value of the pixel point.
And step 209, color value binarization.
The binarized image may be as shown at 306 in fig. 3, where white is the second purple fringing candidate region.
And step 210, taking the intersection to determine the purple boundary area.
White is a purple-fringed area as shown at 307 in fig. 3.
The purple boundary area of the first image can be accurately and efficiently determined through the steps.
Fig. 4 is a schematic structural diagram of an image purple fringe detection apparatus according to an embodiment of the present invention. As shown in fig. 4, the apparatus may include: an acquisition module 410 and a determination module 420.
An acquiring module 410 is configured to acquire a first image.
The determining module 420 is configured to determine a first purple fringing candidate region of the first image according to a gray level change of the first image in the first color space.
The determining module 420 is further configured to determine a second purple fringing candidate region of the first image according to the color range of the first image in the second color space.
The determining module 420 is further configured to determine a superposition area of the first purple-fringed candidate area and the second purple-fringed candidate area as a purple-fringed area of the first image.
The image purple boundary detection device can be used for realizing the image purple boundary detection method provided by the embodiment of the invention.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 5, the electronic device is in the form of a general purpose computing device. Components of the electronic device may include, but are not limited to: one or more processors 510, a memory 530, and a communication bus 540 that couples various system components including the memory 530 and the processors 510.
Communication bus 540 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. These architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus. In this embodiment of the present invention, the electronic device may further include a camera data interface, such as a Mobile Industry Processor Interface (MIPI), specifically the D-PHY, C-PHY, or M-PHY camera interfaces; the processor 510 may implement the image purple fringing detection method provided by the embodiment of the present invention by calling these interfaces.
Electronic devices typically include a variety of computer system readable media. Such media may be any available media that is accessible by the electronic device and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 530 may include computer system readable media in the form of volatile Memory, such as Random Access Memory (RAM) and/or cache Memory. The electronic device may further include other removable/non-removable, volatile/nonvolatile computer system storage media. Although not shown in FIG. 5, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk such as a "floppy disk" or an optical disk drive for reading from or writing to a removable, nonvolatile optical disk such as a Compact disk Read Only Memory (CD-ROM), digital versatile disk (DVD-ROM), or other optical media may be provided. In these cases, each drive may be connected to the communication bus 540 by one or more data media interfaces. Memory 530 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility having a set (at least one) of program modules, including but not limited to an operating system, one or more application programs, other program modules, and program data, may be stored in memory 530, each of which examples or some combination may include an implementation of a network environment. The program modules generally perform the functions and/or methodologies of the described embodiments of the invention.
The electronic device may also communicate with one or more external devices, with one or more devices that enable a user to interact with the electronic device, and/or with any device (e.g., network card, modem, etc.) that enables the electronic device to communicate with one or more other computing devices. Such communication may occur via communications interface 520. Furthermore, the electronic device may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via a network adapter (not shown in fig. 5) that may communicate with other modules of the electronic device via the communication bus 540. It should be appreciated that, although not shown in fig. 5, other hardware and/or software modules may be used in conjunction with the electronic device, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID (Redundant Array of Independent Disks) systems, tape drives, and data backup storage systems, among others.
The processor 510 executes various functional applications and data processing, for example, implementing the image purple boundary detection method provided by the embodiment of the present invention, by executing programs stored in the memory 530.
The embodiment of the invention also provides a computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions, and the computer instructions enable the computer to execute the image purple fringe detection method provided by the embodiment of the invention.
The computer-readable storage medium described above may take any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable compact disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless explicitly specified otherwise.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. Alternate implementations are included within the scope of the preferred embodiments of the present invention, in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functionality involved, as would be understood by those skilled in the art.
In the embodiments provided by the present invention, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only one kind of logical functional division, and other divisions may be used in practice; multiple units or components may be combined or integrated into another system; and some features may be omitted or not executed. In addition, the coupling, direct coupling, or communication connection shown or discussed between components may be through interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that are within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (10)

1. An image purple fringing detection method, characterized in that the method comprises:
acquiring a first image;
determining a first purple fringing candidate region of the first image according to the gray scale change of the first image in a first color space;
determining a second purple fringing candidate region of the first image according to the color range of the first image in a second color space; and
determining an overlapping region of the first purple fringing candidate region and the second purple fringing candidate region as a purple fringing region of the first image.
2. The method of claim 1, wherein determining the first purple fringing candidate region of the first image according to the gray scale change of the first image in the first color space comprises:
dividing the first image into a plurality of image sub-regions;
determining a pixel gray difference value of each image sub-region; and
determining each image sub-region whose pixel gray difference value is greater than a threshold as the first purple fringing candidate region.
3. The method of claim 2, wherein determining the pixel gray difference value of each image sub-region comprises:
acquiring the maximum pixel gray value in each image sub-region;
acquiring the minimum pixel gray value in each image sub-region; and
determining the difference between the maximum pixel gray value and the minimum pixel gray value of each image sub-region as the pixel gray difference value of that sub-region.
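The per-sub-region gray difference of claims 2 and 3 can be sketched as follows. This is a minimal illustration, not the patent's implementation; the 16-pixel block size and the toy image are assumed for demonstration.

```python
import numpy as np

def block_gray_differences(gray, block=16):
    """Split a grayscale image into block x block sub-regions and return,
    for each sub-region, the difference between its maximum and minimum
    pixel gray values (claim 3's local-contrast measure)."""
    h, w = gray.shape
    diffs = np.zeros((h // block, w // block), dtype=np.int32)
    for i in range(h // block):
        for j in range(w // block):
            sub = gray[i * block:(i + 1) * block, j * block:(j + 1) * block]
            # max - min within the sub-region is its pixel gray difference value
            diffs[i, j] = int(sub.max()) - int(sub.min())
    return diffs

# Toy 32x32 image: flat gray with one bright patch inside the top-left block,
# so only that block has a large max-min difference.
img = np.full((32, 32), 100, dtype=np.uint8)
img[0:8, 0:8] = 255
d = block_gray_differences(img, block=16)
print(d)
```

Sub-regions whose difference exceeds a threshold (fixed or adaptive, per claim 4) then form the first purple fringing candidate region.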
4. The method of claim 2, wherein determining each image sub-region whose pixel gray difference value is greater than a threshold as the first purple fringing candidate region comprises:
determining an adaptive threshold for each image sub-region; and
for any image sub-region, if the pixel gray difference value of the image sub-region is greater than the adaptive threshold corresponding to the image sub-region, determining the image sub-region as part of the first purple fringing candidate region.
5. The method of claim 4, wherein determining the adaptive threshold for each image sub-region comprises:
for any image sub-region, determining all image sub-regions contained within a first region range centered on the image sub-region as neighboring image sub-regions of the image sub-region; and
determining a weighted sum of the mean and the standard deviation of the pixel gray difference values of all neighboring image sub-regions of any image sub-region as the adaptive threshold of the image sub-region;
wherein the adaptive threshold is adjustable via the weight of the mean or the weight of the standard deviation.
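The adaptive threshold of claim 5 can be sketched as below. The neighborhood radius and the unit weights `w_mean` and `w_std` are illustrative assumptions; the patent leaves the "first region range" and the weights as tunable parameters.

```python
import numpy as np

def adaptive_thresholds(diffs, radius=1, w_mean=1.0, w_std=1.0):
    """For each sub-region, take all sub-regions inside a (2*radius+1)^2
    window centered on it as its neighbors, and set its threshold to a
    weighted sum of the mean and standard deviation of the neighbors'
    gray difference values (claim 5). w_mean / w_std tune sensitivity."""
    h, w = diffs.shape
    thr = np.zeros((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            # Clamp the window to the image border.
            i0, i1 = max(0, i - radius), min(h, i + radius + 1)
            j0, j1 = max(0, j - radius), min(w, j + radius + 1)
            neigh = diffs[i0:i1, j0:j1].astype(float)
            thr[i, j] = w_mean * neigh.mean() + w_std * neigh.std()
    return thr

# One high-contrast sub-region among flat ones: only it exceeds its
# locally computed threshold, so only it enters the candidate region.
diffs = np.array([[10.0, 10.0], [10.0, 90.0]])
thr = adaptive_thresholds(diffs, radius=1)
candidate = diffs > thr
print(candidate)
```

Because the threshold tracks local statistics, smooth areas with mild contrast are not flagged while sharp local outliers are, which is what makes the comparison of claim 4 adaptive.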
6. The method of claim 1, wherein determining the second purple fringing candidate region of the first image according to the color range of the first image in the second color space comprises:
determining, according to the color value of each pixel of the first image in the second color space, whether the color of the pixel falls within a preset color range; and
determining the pixels whose colors fall within the preset color range as pixels of the second purple fringing candidate region.
7. The method of claim 1, wherein determining an overlapping region of the first purple fringing candidate region and the second purple fringing candidate region as the purple fringing region of the first image comprises:
for any pixel in the first image, if the pixel belongs to both the first purple fringing candidate region and the second purple fringing candidate region, determining the pixel as a pixel of the purple fringing region.
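The color-range test of claim 6 and the intersection of claim 7 can be sketched together. The choice of YCbCr (BT.601) as the second color space and the Cb/Cr bounds for "purple" are assumptions for illustration; the patent does not fix a particular color space or range.

```python
import numpy as np

def purple_mask_ycbcr(rgb, cb_min=140.0, cr_min=140.0):
    """Flag pixels whose color falls in an assumed 'purple' range of
    YCbCr (BT.601): purple is both bluish (high Cb) and reddish (high Cr).
    The bounds are illustrative, not taken from the patent."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= cb_min) & (cr >= cr_min)

# First candidate: high-contrast pixels (here a hand-made mask standing in
# for the gray-difference test of claims 2-5).
contrast_mask = np.array([[True, True], [False, True]])

# Second candidate: pixels whose color is in the purple range.
rgb = np.zeros((2, 2, 3), dtype=np.uint8)
rgb[0, 0] = (180, 40, 200)  # purple-ish pixel
rgb[1, 1] = (180, 40, 200)
color_mask = purple_mask_ycbcr(rgb)

# Claim 7: the purple fringing region is the per-pixel intersection.
purple_region = contrast_mask & color_mask
print(purple_region)
```

Only pixels that are both high-contrast and purple-colored survive, which is why the intersection suppresses false positives from either test alone.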
8. An image purple fringing detection apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire a first image; and
a determining module, configured to determine a first purple fringing candidate region of the first image according to the gray scale change of the first image in a first color space;
the determining module being further configured to determine a second purple fringing candidate region of the first image according to the color range of the first image in a second color space; and
the determining module being further configured to determine an overlapping region of the first purple fringing candidate region and the second purple fringing candidate region as a purple fringing region of the first image.
9. An electronic device, comprising:
at least one processor; and
at least one memory communicatively coupled to the processor, wherein:
the memory stores program instructions executable by the processor, the processor being capable of invoking the program instructions to perform the method of any of claims 1 to 7.
10. A computer-readable storage medium, comprising a stored program, wherein the program, when executed, controls an apparatus in which the computer-readable storage medium is located to perform the method of any of claims 1-7.
CN202211189509.6A 2022-09-28 2022-09-28 Image purple edge detection method and device Pending CN115661179A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211189509.6A CN115661179A (en) 2022-09-28 2022-09-28 Image purple edge detection method and device

Publications (1)

Publication Number Publication Date
CN115661179A true CN115661179A (en) 2023-01-31

Family

ID=84984687



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination