CN114155211A - Image processing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114155211A
Authority
CN
China
Prior art keywords
image
purple
marking
pixels
processed
Prior art date
Legal status
Pending
Application number
CN202111387382.4A
Other languages
Chinese (zh)
Inventor
毛礼建
李远沐
熊剑平
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202111387382.4A
Publication of CN114155211A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/0002: Image analysis; inspection of images, e.g. flaw detection
    • G06T 5/20: Image enhancement or restoration using local operators
    • G06T 5/30: Erosion or dilatation, e.g. thinning
    • G06T 5/90: Dynamic range modification of images or parts thereof
    • G06T 7/13: Segmentation; edge detection
    • G06T 7/90: Determination of colour characteristics
    • G06T 2207/30204: Indexing scheme for image analysis or image enhancement; subject/context of image processing: marker


Abstract

The application discloses an image processing method, an image processing device, electronic equipment and a storage medium in the technical field of image processing. The method comprises: obtaining an image to be processed; marking luminance pixels and non-luminance pixels in the image with different mark values to obtain a brightness mark map; marking edge pixels and non-edge pixels with different mark values to obtain an edge mark map; marking purple pixels and non-purple pixels with different mark values to obtain a color mark map; and determining, based on the brightness mark map, the edge mark map and the color mark map, whether purple fringing exists in the image to be processed. Detection thus combines the three features of brightness, purple color and edges, matching both the characteristic that a purple fringe is a purple-colored edge and the characteristic that it lies near a highlight region, so purple fringing in the image to be processed can be detected reliably and detection accuracy is improved.

Description

Image processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
In the field of image processing technology, purple fringing refers to purple speckles that appear at the boundary between highlight and lowlight regions when a camera captures a scene with high contrast. Because purple fringing degrades image quality, detecting it is necessary; how to perform purple fringing detection reasonably and improve its accuracy is a problem to be solved urgently.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, electronic equipment and a storage medium, which are used for solving the problem of low accuracy of purple fringing detection in the related art.
In a first aspect, an embodiment of the present application provides an image processing method, including:
marking brightness pixels and non-brightness pixels in an image to be processed by adopting different marking values to obtain a brightness marking image, wherein the brightness pixels are pixels with brightness larger than a set value;
marking edge pixels and non-edge pixels in the image to be processed by adopting different marking values to obtain an edge marking image;
marking purple pixels and non-purple pixels in the image to be processed by adopting different marking values to obtain a color marking image;
and determining whether purple fringing exists in the image to be processed based on the brightness mark map, the edge mark map and the color mark map.
In some embodiments, determining whether purple fringing exists in the image to be processed based on the brightness label map, the edge label map and the color label map includes:
determining purple fringing pixels in the image to be processed based on the brightness mark map, the edge mark map and the color mark map;
and determining whether purple fringing exists in the image to be processed or not based on the determined purple fringing pixels.
In some embodiments, determining whether purple fringing exists in the image to be processed based on the determined purple fringed pixels includes:
and determining whether purple fringing exists in the image to be processed or not based on the determined distribution characteristics of purple fringing pixels.
In some embodiments, determining purple fringing pixels in the image to be processed based on the brightness mark map, the edge mark map, and the color mark map includes:
in response to any pixel in the image to be processed meeting a preset judgment condition, determining that the pixel is a purple-fringed pixel, wherein the preset judgment condition includes: the mark value of the pixel in the brightness mark map indicates that the pixel is a luminance pixel, the mark value of the pixel in the edge mark map indicates that the pixel is an edge pixel, and the mark value of the pixel in the color mark map indicates that the pixel is a purple pixel;
and determining that the pixel is not a purple-fringed pixel in response to the pixel not meeting the preset judgment condition.
In some embodiments, after determining purple fringed pixels in the image to be processed based on the brightness mark map, the edge mark map and the color mark map, the method further includes:
marking each purple border pixel and each non-purple border pixel in the image to be processed by adopting different marking values to obtain a purple border marking image;
determining whether purple fringing exists in the image to be processed based on the determined distribution characteristics of purple fringing pixels, wherein the determining comprises the following steps:
and determining whether the purple fringing exists in the image to be processed or not based on the distribution characteristics of the determined purple fringing pixels in the purple fringing mark picture.
In some embodiments, determining whether the image to be processed has purple fringing based on the distribution characteristics of the determined purple fringed pixels in the purple fringed marking map includes:
dividing the purple fringed label graph into preset image blocks;
and determining whether the image to be processed has purple fringing or not based on the number of purple fringing pixels in at least one image block in the preset image blocks.
In some embodiments, determining whether purple fringing exists in the image to be processed based on the number of purple fringing pixels in at least one image block of the preset image blocks includes:
responding to the fact that the number of purple fringed pixels in any image block exceeds a first preset value, and determining that any image block is a purple fringed block;
and determining that purple fringing exists in the image to be processed in response to the number of purple fringed blocks exceeding a second preset value.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including:
the brightness marking module is used for marking brightness pixels and non-brightness pixels in the image to be processed by adopting different marking values to obtain a brightness marking image, wherein the brightness pixels are pixels with brightness larger than a set value;
the edge marking module is used for marking edge pixels and non-edge pixels in the image to be processed by adopting different marking values to obtain an edge marking image;
the color marking module is used for marking purple pixels and non-purple pixels in the image to be processed by adopting different marking values to obtain a color marking image;
a determining module, configured to determine whether purple fringing exists in the image to be processed based on the brightness label map, the edge label map, and the color label map.
In a third aspect, an embodiment of the present application provides an electronic device, including: at least one processor, and a memory communicatively coupled to the at least one processor, wherein:
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the image processing method described above.
In a fourth aspect, embodiments of the present application provide a storage medium; when instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the image processing method described above.
In the embodiments of the application, luminance pixels and non-luminance pixels in an image to be processed are marked with different mark values to obtain a brightness mark map; edge pixels and non-edge pixels are marked with different mark values to obtain an edge mark map; purple pixels and non-purple pixels are marked with different mark values to obtain a color mark map; and whether purple fringing exists in the image to be processed is determined based on the brightness mark map, the edge mark map and the color mark map. Detection thus combines the three features of brightness, purple color and edges, matching both the characteristic that a purple fringe is a purple-colored edge and the characteristic that it lies near a highlight region, so purple fringing in the image to be processed can be detected reliably and detection accuracy is improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is an application scene diagram of an image processing method according to an embodiment of the present application;
fig. 2 is a flowchart of an image processing method according to an embodiment of the present application;
fig. 3 is a flowchart of a method for determining whether a purple fringing exists in an image to be processed according to an embodiment of the present application;
fig. 4 is a flowchart of another method for determining whether a purple fringing exists in an image to be processed according to an embodiment of the present application;
fig. 5 is a schematic view of a purple fringing detection process provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 7 is a schematic diagram of a hardware structure of an electronic device for implementing an image processing method according to an embodiment of the present application.
Detailed Description
In order to solve the problem of relatively low accuracy of purple fringing detection in the related art, embodiments of the present application provide an image processing method, an image processing apparatus, an electronic device, and a storage medium.
Some key terms in the embodiments of the present application will be described first:
purple fringed: from the image acquisition technology, purple fringing refers to the phenomenon that color spots appear at the junction of high-light and low-light parts due to large contrast of a shot object in the shooting process of a digital camera; purple fringed generally purple and other colors are possible; generally, the cause of the purple fringing is related to dispersion of a camera lens, a small imaging area of a Charge Coupled Device (CCD) (high imaging cell density), a signal processing algorithm inside the camera, and the like.
The preferred embodiments of the present application will be described below with reference to the accompanying drawings of the specification, it should be understood that the preferred embodiments described herein are merely for illustrating and explaining the present application, and are not intended to limit the present application, and that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
Fig. 1 is an application scene diagram of an image processing method according to an embodiment of the present application. The scene includes an acquisition device and a computer device. The acquisition device, such as a camera of any type, may be deployed wherever lawful monitoring is performed, for example at traffic intersections and public buildings, and is used to acquire video data and send it to the computer device. The computer device stores the video data sent by the acquisition device and analyzes it, for example to determine whether purple fringing exists in images of the video data, or whether violations or illegal behaviors appear in those images.
In specific implementation, the acquisition device and the computer device can be communicatively connected through one or more networks. A network may be wired or wireless; for example, a wireless network may be a mobile cellular network or a Wireless Fidelity (WiFi) network, and of course may be another possible network, which is not limited in the embodiments of the present application.
After this exemplary application scenario of the embodiments of the present application, the image processing method proposed by the present application is described below with a specific embodiment. Fig. 2 is a flowchart of an image processing method according to an embodiment of the present application; the method can be applied to the computer device in fig. 1 and includes the following steps.
In step S201, the luminance pixel and the non-luminance pixel in the image to be processed are marked with different marking values, so as to obtain a luminance mark map.
The image to be processed may be an image acquired by a camera in real time, or may be a historical image acquired by the camera. The luminance pixel refers to a pixel having luminance greater than a set value.
Taking an RGB image as an example, the pixel (i, j) in the ith row and jth column of the image to be processed has three channel values R(i, j), G(i, j) and B(i, j). If the three values satisfy the luminance threshold condition (formula provided as an image in the original publication; a threshold comparison on R(i, j), G(i, j) and B(i, j)), pixel (i, j) can be determined to be a luminance pixel; if they do not, pixel (i, j) is not a luminance pixel (i.e., it is a non-luminance pixel).
Then, the luminance pixel and the non-luminance pixel in the image to be processed may be marked with different marking values, for example, 255 is used for marking the luminance pixel in the image to be processed, and 0 is used for marking the non-luminance pixel in the image to be processed, so as to obtain a luminance marking map. The pixels in the brightness label map correspond to the pixels in the image to be processed one by one.
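A minimal Python sketch of this marking step follows. Because the patent's exact luminance condition is published only as an image formula, the per-channel comparison and the threshold of 200 below are illustrative assumptions, not the patented values.

```python
import numpy as np

def luminance_mark_map(img_rgb: np.ndarray, threshold: int = 200) -> np.ndarray:
    """Mark luminance pixels with 255 and non-luminance pixels with 0."""
    r = img_rgb[..., 0].astype(np.int32)
    g = img_rgb[..., 1].astype(np.int32)
    b = img_rgb[..., 2].astype(np.int32)
    # Assumed stand-in for the patent's (image-only) luminance condition:
    # all three channels exceed the set value.
    is_bright = (r > threshold) & (g > threshold) & (b > threshold)
    return np.where(is_bright, 255, 0).astype(np.uint8)
```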
In step S202, edge pixels and non-edge pixels in the image to be processed are marked with different marking values, so as to obtain an edge marking map.
Edge pixels can be determined from an edge detection result, and all remaining pixels are non-edge pixels.
The following describes a process for determining edge pixels in the image to be processed, taking Canny edge detection as an example.
In specific implementation, the image to be processed can be converted into a grayscale image and Gaussian-filtered, after which the gradient magnitude and direction of the filtered image are calculated as:

$g(i,j) = \sqrt{g_x(i,j)^2 + g_y(i,j)^2}$

$\theta(i,j) = \arctan\left(g_y(i,j) / g_x(i,j)\right)$

where g(i, j) is the gradient magnitude of the pixel (i, j) in the ith row and jth column of the grayscale image, θ(i, j) is the gradient direction of pixel (i, j), and g_x(i, j) and g_y(i, j) are the gradients of pixel (i, j) in the x and y directions, respectively.
Next, non-maximum suppression removes non-edge points and sharpens blurred boundaries, which reduces false edge detections.
Specifically, for pixel (i, j), its gradient magnitude is compared with those of the two neighboring pixels along the gradient direction θ(i, j); if g(i, j) is the maximum of the three, pixel (i, j) is determined to be an edge pixel; otherwise it is determined not to be an edge pixel (i.e., a non-edge pixel).
Further, edge pixels and non-edge pixels in the image to be processed may be marked with different marking values, for example, 255 is used for marking edge pixels in the image to be processed, and 0 is used for marking non-edge pixels in the image to be processed, so as to obtain an edge marking map, where pixels in the edge marking map correspond to pixels in the image to be processed one to one.
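As an illustration, the edge marking step can be sketched with OpenCV, whose Canny implementation already performs the gradient computation and non-maximum suppression described above; the hysteresis thresholds 50 and 150 are illustrative, not values from the patent.

```python
import cv2
import numpy as np

def edge_mark_map(img_rgb: np.ndarray, low: int = 50, high: int = 150) -> np.ndarray:
    """Mark edge pixels with 255 and non-edge pixels with 0 using Canny."""
    gray = cv2.cvtColor(img_rgb, cv2.COLOR_RGB2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)  # smooth before gradient computation
    return cv2.Canny(gray, low, high)         # uint8 map: 255 = edge, 0 = non-edge
```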
In step S203, the purple pixels and the non-purple pixels in the image to be processed are marked with different marking values, so as to obtain a color marking map.
Still taking an RGB image as an example, if the pixel (i, j) in the image to be processed satisfies the purple-color condition (formula provided as an image in the original publication; a constraint on the relative magnitudes of R(i, j), G(i, j) and B(i, j)), pixel (i, j) is determined to be a purple pixel; otherwise, pixel (i, j) is determined not to be a purple pixel (i.e., a non-purple pixel).
Then, purple pixels and non-purple pixels in the image to be processed can be marked with different mark values, for example marking purple pixels with True and non-purple pixels with False, to obtain a color mark map, in which the pixels correspond one to one to the pixels in the image to be processed.
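A sketch of this color marking step is below. The patent's purple condition exists only as an image formula, so the inequalities here (blue and red clearly dominating green) are an assumed approximation of "purple" in RGB space.

```python
import numpy as np

def color_mark_map(img_rgb: np.ndarray) -> np.ndarray:
    """Mark purple pixels True and non-purple pixels False."""
    r = img_rgb[..., 0].astype(np.int32)
    g = img_rgb[..., 1].astype(np.int32)
    b = img_rgb[..., 2].astype(np.int32)
    # Assumed purple test: blue and red channels clearly exceed green.
    return (b > g + 25) & (r > g + 15)  # boolean map, True = purple
```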
In step S204, it is determined whether purple fringing exists in the image to be processed based on the luminance marker map, the edge marker map, and the color marker map.
In some embodiments, whether purple fringing exists in the image to be processed can be determined according to the flow shown in fig. 3, which includes the following steps.
In step S301a, purple fringing pixels in the image to be processed are determined based on the luminance mark map, the edge mark map, and the color mark map.
In some embodiments, the preset judgment condition may be: the mark value of the pixel in the brightness mark map indicates that the pixel is a luminance pixel, the mark value of the pixel in the edge mark map indicates that the pixel is an edge pixel, and the mark value of the pixel in the color mark map indicates that the pixel is a purple pixel. For each pixel in the image to be processed, if the pixel meets the preset judgment condition, it is determined to be a purple-fringed pixel; if not, it is determined not to be a purple-fringed pixel (i.e., a non-purple-fringed pixel).
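This AND of the three maps translates directly into array operations; a minimal sketch assuming the 255/0 and True/False mark values used in the text:

```python
import numpy as np

def purple_fringe_pixels(bright: np.ndarray, edge: np.ndarray,
                         color: np.ndarray) -> np.ndarray:
    """A pixel is purple-fringed only if all three maps mark it:
    bright and edge use 255/0 marks, color uses True/False."""
    return (bright == 255) & (edge == 255) & color  # boolean purple-fringe map
```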
In step S302a, based on the determined purple fringing pixel, it is determined whether or not there is a purple fringing in the image to be processed.
Generally, when purple fringing exists in an image, purple fringing pixels in the image are distributed continuously rather than scattered, so whether purple fringing exists in the image to be processed can be determined based on the determined distribution characteristics of the purple fringing pixels. That is, when the determined purple boundary pixels have the continuous distribution feature, it may be determined that purple boundaries exist in the image to be processed.
Therefore, the purple-fringed pixels in the image to be processed are determined by combining the brightness mark map, the edge mark map and the color mark map corresponding to the image to be processed, and then whether purple fringed exists in the image to be processed is determined by further combining the determined distribution characteristics of the purple-fringed pixels, so that the probability of false purple fringed recognition is reduced, and the accuracy of purple fringed detection is improved.
In some embodiments, whether purple fringing exists in the image to be processed can also be determined according to the flow shown in fig. 4, and the flow includes the following steps.
In step S401a, purple fringing pixels in the image to be processed are determined based on the luminance mark map, the edge mark map, and the color mark map.
The implementation of this step can be seen in S301a, and is not described herein.
In step S402a, purple fringed pixels and non-purple fringed pixels in the image to be processed are marked with different marking values, so as to obtain a purple fringed marking map.
For example, the purple border pixels in the image to be processed are marked by 255, and the non-purple border pixels in the image to be processed are marked by 0, so that a purple border marked image can be obtained, and the pixels in the purple border marked image correspond to the pixels in the image to be processed one by one.
In step S403a, it is determined whether or not the image to be processed has purple fringing based on the distribution characteristics of the determined purple fringed pixels in the purple fringed marker map.
In specific implementation, the purple fringing mark map can be divided into preset image blocks, and then whether purple fringing exists in the image to be processed is determined based on the number of purple fringing pixels in at least one image block in the preset image blocks.
For example, for any image block in the preset image blocks, if the number of purple-edge pixels in the image block exceeds a first preset value, the image block is determined to be a purple-edge block, and if the number of purple-edge pixels in the image block does not exceed the first preset value, the image block is determined not to be the purple-edge block. Further, if the number of purple edge blocks exceeds a second preset value, it is determined that purple edges exist in the image to be processed, and if the number of purple edge blocks does not exceed the second preset value, it is determined that purple edges do not exist in the image to be processed.
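A sketch of this block-wise decision follows; the block grid size and the two preset values are illustrative, since the patent leaves them configurable.

```python
import numpy as np

def has_purple_fringing(fringe: np.ndarray, m: int = 8, n: int = 8,
                        px_thresh: int = 20, blk_thresh: int = 3) -> bool:
    """Split the boolean fringe map into m x n blocks, count purple-fringed
    pixels per block, and apply the two preset thresholds."""
    h, w = fringe.shape
    bh, bw = h // m, w // n
    purple_blocks = 0
    for i in range(m):
        for j in range(n):
            block = fringe[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            if int(block.sum()) > px_thresh:  # first preset value
                purple_blocks += 1
    return purple_blocks > blk_thresh         # second preset value
```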
Therefore, the purple-fringed pixels in the image to be processed are determined by combining the brightness mark map, the edge mark map and the color mark map corresponding to the image to be processed, then the purple-fringed pixels and the non-purple-fringed pixels are marked by adopting different mark values to obtain the purple-fringed mark map, and finally whether purple fringed exists in the image to be processed is determined based on the distribution characteristics of the purple-fringed pixels in the purple-fringed mark map, so that the probability of false purple-fringed recognition is favorably further reduced, and the accuracy of purple-fringed detection is improved.
In addition, considering that purple fringing usually occurs at the edge of a highlight area, and in order to avoid missing purple-fringed pixels, the brightness mark map can also be dilated before the purple fringing mark map is generated from the brightness mark map, the edge mark map and the color mark map.
In specific implementation, the brightness mark map may be dilated using the following formula:

$A \oplus B = \{\, x \mid (B)_x \cap A \neq \emptyset \,\}$

where A is the brightness mark map, B is the dilation template (e.g., of size 3 × 3), x ranges over the positions at which B is placed, ⊕ denotes the dilation operation, and ∅ denotes the empty set.
Therefore, pixels which are originally positioned at the edge of the brightness area in the image to be processed can be marked as brightness pixels, the area occupied by the brightness pixels is enlarged, and the purple pixels which are originally positioned at the edge of the brightness area are prevented from being missed, so that the accuracy of purple edge detection can be improved.
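In practice this dilation is a one-liner with OpenCV; a sketch assuming the 255/0 brightness mark map produced earlier:

```python
import cv2
import numpy as np

def dilate_bright_map(bright_map: np.ndarray, ksize: int = 3) -> np.ndarray:
    """Dilate a 255/0 brightness mark map with a ksize x ksize template so
    pixels at the rim of highlight regions are also marked as bright."""
    kernel = np.ones((ksize, ksize), np.uint8)
    return cv2.dilate(bright_map, kernel)
```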
The embodiments of the present application will be described below with reference to specific examples.
Fig. 5 is a schematic view of a purple fringing detection process according to an embodiment of the present application. The image to be processed is subjected to highlight pixel screening, edge pixel screening and purple pixel screening; purple-fringed pixels are marked based on the three screening results to obtain a purple fringing mark map; the purple fringing mark map is then partitioned into image blocks, and each block is classified as a purple-fringed block or a non-purple-fringed block according to whether the number of purple-fringed pixels it contains exceeds a first preset value; finally, whether the image to be processed contains purple fringing is determined by whether the number of purple-fringed blocks exceeds a second preset value.
Taking the image to be processed as an RGB image as an example, the above process may be performed as follows.
The method comprises the following steps: and screening the highlight pixels of the image to be processed.
The pixel (i, j) in the ith row and jth column of the image to be processed has three channel values R(i, j), G(i, j) and B(i, j). If the three values satisfy the luminance threshold condition (formula provided as an image in the original publication), pixel (i, j) can be determined to be a luminance pixel; if they do not, pixel (i, j) is a non-luminance pixel.
Then, the luminance pixels in the image to be processed are marked with 255, and the non-luminance pixels in the image to be processed are marked with 0, so that a luminance marked image is obtained.
Considering that purple fringing usually occurs near a highlight region, and in order to also mark pixels located near the highlight region and thus improve detection accuracy, the brightness mark map may be dilated:

$f_1 = A \oplus B = \{\, x \mid (B)_x \cap A \neq \emptyset \,\}$

where A is the brightness mark map, B is the dilation template of size 3 × 3, x ranges over the positions at which B is placed, ⊕ denotes the dilation operation, ∅ denotes the empty set, and f1 is the brightness mark map after dilation.
Step two: and screening edge pixels of the image to be processed.
Taking Canny edge detection as an example, the image to be processed is first converted into a grayscale image and Gaussian-filtered, and the gradient magnitude and direction of the filtered image are then calculated:

$g(i,j) = \sqrt{g_x(i,j)^2 + g_y(i,j)^2}$

$\theta(i,j) = \arctan\left(g_y(i,j) / g_x(i,j)\right)$

where g(i, j) is the gradient magnitude of the pixel (i, j) in the ith row and jth column of the grayscale image, θ(i, j) is the gradient direction of pixel (i, j), and g_x(i, j) and g_y(i, j) are the gradients of pixel (i, j) in the x and y directions, respectively.
Then, the gradients of the pixel (i, j) and the two side pixels are compared along the gradient theta (i, j), if g (i, j) of the pixel (i, j) is the maximum, the pixel (i, j) is determined as an edge pixel, otherwise, the pixel (i, j) is determined as a non-edge pixel.
Further, edge pixels in the image to be processed are marked with 255 and non-edge pixels with 0, yielding an edge mark map f2.
Step three: and screening purple pixel points.
For example, if the pixel (i, j) in the image to be processed satisfies the purple-color condition (formula provided as an image in the original publication), pixel (i, j) is determined to be a purple pixel; otherwise, pixel (i, j) is determined to be a non-purple pixel.
Then, purple pixels in the image to be processed are marked with True and non-purple pixels with False, yielding a color mark map f3.
Step four: generate a purple fringing mark map P(i, j) from the three feature maps f1, f2 and f3:

$P(i,j) = \begin{cases} 255, & f_1(i,j) = 255 \ \wedge\ f_2(i,j) = 255 \ \wedge\ f_3(i,j) = \text{True} \\ 0, & \text{otherwise} \end{cases}$
and the pixels in the P (i, j) correspond to the pixels in the image to be processed one by one.
The above condition means that purple pixels that lie on edges and near highlight regions of the image to be processed are regarded as purple-fringed pixels.
Step five: p (i, j) is divided into m × n image blocks.
Wherein m and n are integers which can be specified in advance.
Step six: screening purple boundary blocks.
For example, for each image block, if the number of purple border pixel points in the image block is greater than a first preset value, determining that the image block is a purple border block; and if the number of purple border pixel points in the image block is not more than a first preset value, determining that the image block is a non-purple border block.
Step seven: and judging whether the image to be processed has purple edges or not.
For example, counting the number of purple boundary blocks in the image to be processed, and if the number of purple boundary blocks is greater than a second preset value, determining that purple boundaries exist in the image to be processed; and if the number of the purple edge blocks is not larger than the second preset value, determining that the purple edges do not exist in the image to be processed.
In the embodiments of the application, purple fringing is detected, in line with how it manifests in images, by using the three features of highlight, edge and purple color. This matches both the characteristic that a purple fringe is a purple-colored edge and the characteristic that it lies near a highlight area, so false detections of non-purple-fringed content can be reduced. Moreover, purple-fringed blocks are screened first and the image is then judged based on the number of such blocks, which matches the continuous spatial distribution of purple fringing, gives stronger robustness, and helps further reduce false detections.
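Tying steps one through seven together, an end-to-end sketch is given below. All thresholds, the purple condition, and the 8 × 8 block grid are illustrative assumptions; the patent publishes its exact formulas only as images and leaves the preset values configurable.

```python
import cv2
import numpy as np

def detect_purple_fringing(img_bgr: np.ndarray) -> bool:
    """End-to-end sketch of the seven-step detection flow."""
    img = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2RGB)
    r, g, b = (img[..., k].astype(np.int32) for k in range(3))

    # Step one: brightness mark map f1 (assumed condition), then dilation.
    f1 = np.where((r > 200) & (g > 200) & (b > 200), 255, 0).astype(np.uint8)
    f1 = cv2.dilate(f1, np.ones((3, 3), np.uint8))

    # Step two: edge mark map f2 via Canny.
    gray = cv2.GaussianBlur(cv2.cvtColor(img, cv2.COLOR_RGB2GRAY), (5, 5), 0)
    f2 = cv2.Canny(gray, 50, 150)

    # Step three: color mark map f3 (assumed purple condition).
    f3 = (b > g + 25) & (r > g + 15)

    # Step four: purple fringing mark map P (pixel must be in all three maps).
    P = (f1 == 255) & (f2 == 255) & f3

    # Steps five to seven: split into 8 x 8 blocks and apply the two presets.
    m = n = 8
    bh, bw = P.shape[0] // m, P.shape[1] // n
    counts = [P[i*bh:(i+1)*bh, j*bw:(j+1)*bw].sum()
              for i in range(m) for j in range(n)]
    purple_blocks = sum(c > 20 for c in counts)  # first preset value: 20
    return purple_blocks > 3                     # second preset value: 3
```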
When the method provided in the embodiments of the present application is implemented in software or hardware or a combination of software and hardware, a plurality of functional modules may be included in the electronic device, and each functional module may include software, hardware or a combination of software and hardware.
Based on the same technical concept, embodiments of the present application further provide an image processing apparatus, where the principle of the image processing apparatus to solve the problem is similar to that of the image processing method, so that the implementation of the image processing apparatus can refer to the implementation of the image processing method, and repeated details are not repeated.
Fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure, which includes a brightness marking module 601, an edge marking module 602, a color marking module 603, and a determining module 604.
The brightness marking module 601 is configured to mark a brightness pixel and a non-brightness pixel in the image to be processed by using different marking values to obtain a brightness marking map, where the brightness pixel is a pixel with brightness greater than a set value;
an edge marking module 602, configured to mark an edge pixel and a non-edge pixel in the image to be processed with different marking values, to obtain an edge marking map;
the color marking module 603 is configured to mark a purple pixel and a non-purple pixel in the image to be processed with different marking values to obtain a color marking map;
a determining module 604, configured to determine whether purple fringing exists in the image to be processed based on the brightness label map, the edge label map, and the color label map.
In some embodiments, the determining module 604 is specifically configured to:
determining purple fringing pixels in the image to be processed based on the brightness mark map, the edge mark map and the color mark map;
and determining whether purple fringing exists in the image to be processed or not based on the determined purple fringing pixels.
In some embodiments, the determining module 604 is specifically configured to:
and determining whether purple fringing exists in the image to be processed or not based on the determined distribution characteristics of purple fringing pixels.
In some embodiments, the determining module 604 is specifically configured to:
in response to that any pixel in the image to be processed meets a preset judgment condition, determining that the pixel is a purple edge pixel, wherein the preset judgment condition comprises the following steps: the marking value of the pixel corresponding to the brightness marking map indicates that the pixel is a brightness pixel, the marking value of the pixel corresponding to the edge marking map indicates that the pixel is an edge pixel, and the marking value of the pixel corresponding to the color marking map indicates that the pixel is a purple color;
and determining that the pixel is not a purple-fringed pixel in response to the pixel not meeting the preset judgment condition.
In some embodiments, further comprising:
a purple border marking module 605, configured to mark, after determining purple border pixels in the image to be processed based on the brightness marking map, the edge marking map, and the color marking map, each purple border pixel and each non-purple border pixel in the image to be processed with different marking values, so as to obtain a purple border marking map;
the determining module 604 is specifically configured to determine whether purple fringing exists in the image to be processed based on the distribution feature of the determined purple fringed pixels in the purple fringed marking map.
In some embodiments, the determining module 604 is specifically configured to:
dividing the purple fringed label graph into preset image blocks;
and determining whether the image to be processed has purple fringing or not based on the number of purple fringing pixels in at least one image block in the preset image blocks.
In some embodiments, the determining module 604 is specifically configured to:
responding to the fact that the number of purple fringed pixels in any image block exceeds a first preset value, and determining that any image block is a purple fringed block;
and determining that purple fringing exists in the image to be processed in response to the number of purple fringed blocks exceeding a second preset value.
The division of the modules in the embodiments of the present application is schematic, and only one logic function division is provided, and in actual implementation, there may be another division manner, and in addition, each function module in each embodiment of the present application may be integrated in one processor, may also exist alone physically, or may also be integrated in one module by two or more modules. The coupling of the various modules to each other may be through interfaces that are typically electrical communication interfaces, but mechanical or other forms of interfaces are not excluded. Thus, modules described as separate components may or may not be physically separate, may be located in one place, or may be distributed in different locations on the same or different devices. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
Having described the image processing method and apparatus of the exemplary embodiments of the present application, an electronic device according to another exemplary embodiment of the present application is described next.
As will be appreciated by one skilled in the art, aspects of the present application may be embodied as a system, method or program product. Accordingly, various aspects of the present application may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" system.
In some possible implementations, an electronic device according to the present application may include at least one processor and at least one memory. The memory stores program code which, when executed by the processor, causes the processor to perform the methods according to the various exemplary embodiments of the present application described above in this specification; for example, the processor may perform the steps of the image processing method described above.
The electronic device 130 according to this embodiment of the present application is described below with reference to fig. 7. The electronic device 130 shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the electronic device 130 is represented in the form of a general electronic device. The components of the electronic device 130 may include, but are not limited to: the at least one processor 131, the at least one memory 132, and a bus 133 that connects the various system components (including the memory 132 and the processor 131).
Bus 133 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
The memory 132 may include readable media in the form of volatile memory, such as Random Access Memory (RAM) 1321 and/or cache memory 1322, and may further include Read-Only Memory (ROM) 1323.
Memory 132 may also include a program/utility 1325 having a set (at least one) of program modules 1324, such program modules 1324 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The electronic device 130 may also communicate with one or more external devices 134 (e.g., keyboard, pointing device, etc.), with one or more devices that enable a user to interact with the electronic device 130, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 130 to communicate with one or more other electronic devices. Such communication may occur via input/output (I/O) interfaces 135. Also, the electronic device 130 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 136. As shown, network adapter 136 communicates with other modules for electronic device 130 over bus 133. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 130, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In an exemplary embodiment, a computer-readable storage medium comprising instructions, such as the memory 132 comprising instructions, executable by the processor 131 to perform the image processing method described above, is also provided. Alternatively, the storage medium may be a non-transitory computer-readable storage medium, for example a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product is also provided, comprising a computer program which, when executed by the processor 131, implements the exemplary method as provided herein.
In an exemplary embodiment, various aspects of the image processing method provided by the present application may also be implemented in the form of a program product, which includes program code for causing a computer device to perform the steps of the methods according to the various exemplary embodiments of the present application described above in this specification when the program product is run on the computer device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable Disk, a hard Disk, a RAM, a ROM, an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a Compact Disk Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product for image processing in the embodiment of the present application may be a CD-ROM and include program codes, and may be run on a computing device. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++ as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In situations involving remote computing devices, the remote computing device may be connected to the user computing device over any kind of network, such as a Local Area Network (LAN) or Wide Area Network (WAN), or may be connected to an external computing device (e.g., over the internet using an internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such division is merely exemplary and not mandatory. Indeed, the features and functions of two or more units described above may be embodied in one unit, according to embodiments of the application. Conversely, the features and functions of one unit described above may be further divided into embodiments by a plurality of units.
Further, while the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. An image processing method, comprising:
marking brightness pixels and non-brightness pixels in an image to be processed by adopting different marking values to obtain a brightness marking image, wherein the brightness pixels are pixels with brightness larger than a set value;
marking edge pixels and non-edge pixels in the image to be processed by adopting different marking values to obtain an edge marking image;
marking purple pixels and non-purple pixels in the image to be processed by adopting different marking values to obtain a color marking image;
and determining whether purple fringing exists in the image to be processed based on the brightness mark map, the edge mark map and the color mark map.
2. The method of claim 1, wherein determining whether purple fringing exists in the image to be processed based on the brightness marker map, the edge marker map, and the color marker map comprises:
determining purple fringing pixels in the image to be processed based on the brightness mark map, the edge mark map and the color mark map;
and determining whether purple fringing exists in the image to be processed or not based on the determined purple fringing pixels.
3. The method of claim 2, wherein determining whether purple fringing exists in the image to be processed based on the determined purple fringed pixels comprises:
and determining whether purple fringing exists in the image to be processed or not based on the determined distribution characteristics of purple fringing pixels.
4. The method of claim 2, wherein determining purple fringed pixels in the image to be processed based on the brightness marker map, the edge marker map, and the color marker map comprises:
in response to that any pixel in the image to be processed meets a preset judgment condition, determining that the pixel is a purple-fringed pixel, wherein the preset judgment condition comprises the following steps: the marking value of the pixel corresponding to the brightness marking map indicates that the pixel is a brightness pixel, the marking value of the pixel corresponding to the edge marking map indicates that the pixel is an edge pixel, and the marking value of the pixel corresponding to the color marking map indicates that the pixel is a purple pixel;
and determining that the pixel is not a purple-fringed pixel in response to the pixel not meeting the preset judgment condition.
5. The method of claim 3, after determining purple-fringed pixels in the image to be processed based on the brightness marker map, the edge marker map, and the color marker map, further comprising:
marking each purple border pixel and each non-purple border pixel in the image to be processed by adopting different marking values to obtain a purple border marking image;
determining whether purple fringing exists in the image to be processed based on the determined distribution characteristics of purple fringing pixels, wherein the determining comprises the following steps:
and determining whether the purple fringing exists in the image to be processed or not based on the distribution characteristics of the determined purple fringing pixels in the purple fringing mark picture.
6. The method of claim 5, wherein determining whether the image to be processed has purple fringing based on the distribution characteristics of the determined purple fringed pixels in the purple fringed marking map comprises:
dividing the purple fringed label graph into preset image blocks;
and determining whether the image to be processed has purple fringing or not based on the number of purple fringing pixels in at least one image block in the preset image blocks.
7. The method as claimed in claim 6, wherein determining whether purple fringing exists in the image to be processed based on the number of purple fringing pixels in at least one image block of the preset image blocks comprises:
responding to the fact that the number of purple fringed pixels in any image block exceeds a first preset value, and determining that any image block is a purple fringed block;
and determining that purple fringing exists in the image to be processed in response to the number of purple fringed blocks exceeding a second preset value.
8. An image processing apparatus characterized by comprising:
the brightness marking module is used for marking brightness pixels and non-brightness pixels in the image to be processed by adopting different marking values to obtain a brightness marking image, wherein the brightness pixels are pixels with brightness larger than a set value;
the edge marking module is used for marking edge pixels and non-edge pixels in the image to be processed by adopting different marking values to obtain an edge marking image;
the color marking module is used for marking purple pixels and non-purple pixels in the image to be processed by adopting different marking values to obtain a color marking image;
a determining module, configured to determine whether purple fringing exists in the image to be processed based on the brightness label map, the edge label map, and the color label map.
9. An electronic device, comprising: at least one processor, and a memory communicatively coupled to the at least one processor, wherein:
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
10. A storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the method of any of claims 1-7.
CN202111387382.4A 2021-11-22 2021-11-22 Image processing method and device, electronic equipment and storage medium Pending CN114155211A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111387382.4A CN114155211A (en) 2021-11-22 2021-11-22 Image processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114155211A true CN114155211A (en) 2022-03-08

Family ID: 80457237

Country Status (1)

Country Link
CN (1) CN114155211A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination