CN115578382B - Image anomaly detection method, device, equipment and computer readable storage medium - Google Patents

Image anomaly detection method, device, equipment and computer readable storage medium

Info

Publication number
CN115578382B
CN115578382B (application CN202211472510.XA)
Authority
CN
China
Prior art keywords
image
pixel
detection
detection frame
calculation result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211472510.XA
Other languages
Chinese (zh)
Other versions
CN115578382A (en)
Inventor
郑杨婷
陈殷齐
李佩文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202211472510.XA
Publication of CN115578382A
Application granted
Publication of CN115578382B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/25: Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V 10/40: Extraction of image or video features
    • G06V 10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74: Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75: Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/757: Matching configurations of points or features

Abstract

The invention discloses an image anomaly detection method, an image anomaly detection device, image anomaly detection equipment and a computer readable storage medium, wherein the method comprises the following steps: performing pixel-level anomaly detection on a first image and a second image to obtain a pixel anomaly detection result for each pixel point in the first image, the pixel anomaly detection result representing whether the corresponding pixel point in the first image has changed relative to the pixel point at the same position in the second image; searching the first image for a dense change area whose change density meets a preset density condition, the change density of an area being calculated according to the pixel anomaly detection results; and taking the dense change area as the region anomaly detection result between the first image and the second image. The method improves the robustness of the image anomaly detection result to noise, and thereby improves the reference value of the image anomaly detection result.

Description

Image anomaly detection method, device, equipment and computer readable storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method, an apparatus, a device, and a computer-readable storage medium for detecting an image anomaly.
Background
Image anomaly detection is the detection of areas in which one image has changed compared to another. Existing anomaly detection methods are varied, but they all mark anomalous (changed) areas at the pixel level, that is, they detect whether each individual pixel point in the image has changed. Such pixel-level labeling has poor robustness to noise: small noise points are detected as changed areas, so the detection result has little reference value.
Disclosure of Invention
The invention mainly aims to provide an image anomaly detection method, an image anomaly detection device, image anomaly detection equipment and a computer readable storage medium, so as to improve the robustness of the image anomaly detection result to noise and thereby improve the reference value of the image anomaly detection result.
In order to achieve the above object, the present invention provides an image anomaly detection method, including the steps of:
performing pixel-level anomaly detection on a first image and a second image to obtain pixel anomaly detection results corresponding to the respective pixel points in the first image, wherein the pixel anomaly detection result represents whether the corresponding pixel point in the first image has changed relative to the pixel point at the same position in the second image;
searching the first image for a dense change area whose change density meets a preset density condition, wherein the change density of an area is calculated according to the pixel anomaly detection results;
and taking the dense change area as a region anomaly detection result between the first image and the second image.
Optionally, the step of searching the first image for a dense change area whose change density meets a preset density condition includes:
initializing a position parameter of a detection frame in the first image;
calculating a gradient value of an objective function with respect to the position parameter according to the pixel anomaly detection results, wherein the objective function is constructed with the goal of maximizing the change density of the area framed by the detection frame in the first image;
updating the position parameter according to the gradient value, and detecting whether the updated position parameter meets a preset convergence condition;
if so, taking the area framed by the detection frame in the first image after the position parameter is updated as a dense change area;
and if not, returning, based on the updated position parameter, to the step of calculating the gradient value of the objective function with respect to the position parameter according to the pixel anomaly detection results.
Optionally, the detection frame is a rectangular frame, and the step of initializing a position parameter of the detection frame in the first image includes:
initializing position parameters of the detection frame in the first image, wherein the position parameters comprise a first size transformation parameter, a second size transformation parameter and an abscissa and an ordinate of a vertex of a lower left corner of the detection frame in the first image, and the first size transformation parameter and the second size transformation parameter are respectively used for transforming the width and the height of the first image to obtain the size of the detection frame.
Optionally, the value range of the pixel abnormality detection result is a first value and a second value, the first value is a positive number, the second value is a negative number, and the absolute value of the first value is the same as the absolute value of the second value, the first value indicates that the corresponding pixel point in the first image is unchanged relative to the pixel point at the same position in the second image, and the second value indicates that the corresponding pixel point in the first image is changed relative to the pixel point at the same position in the second image;
the step of calculating a gradient value of an objective function with respect to the position parameter according to the pixel anomaly detection results includes:
calculating the sum of the pixel anomaly detection results of all pixel points in the first image whose ordinate is the ordinate of the lower-left corner vertex of the detection frame, to obtain a first calculation result;
multiplying the height of the first image by the second size transformation parameter and adding the ordinate of the lower-left corner vertex of the detection frame to obtain the ordinate of the upper-right corner vertex of the detection frame, and calculating the sum of the pixel anomaly detection results of all pixel points in the second image whose ordinate is the ordinate of the upper-right corner vertex of the detection frame, to obtain a second calculation result;
subtracting the second calculation result from the first calculation result to obtain a third calculation result, taking the third calculation result as the gradient value of the objective function with respect to the abscissa of the lower-left corner vertex of the detection frame, and taking the result of multiplying the third calculation result by the width of the first image as the gradient value of the objective function with respect to the first size transformation parameter;
calculating the sum of the pixel anomaly detection results of all pixel points in the first image whose abscissa is the abscissa of the lower-left corner vertex of the detection frame, to obtain a fourth calculation result;
multiplying the width of the first image by the first size transformation parameter and adding the abscissa of the lower-left corner vertex of the detection frame to obtain the abscissa of the upper-right corner vertex of the detection frame, and calculating the sum of the pixel anomaly detection results of all pixel points in the second image whose abscissa is the abscissa of the upper-right corner vertex of the detection frame, to obtain a fifth calculation result;
and subtracting the fifth calculation result from the fourth calculation result to obtain a sixth calculation result, taking the sixth calculation result as the gradient value of the objective function with respect to the ordinate of the lower-left corner vertex of the detection frame, and taking the result of multiplying the sixth calculation result by the height of the first image as the gradient value of the objective function with respect to the second size transformation parameter.
Optionally, after the step of taking the area framed by the detection frame in the first image after the position parameter is updated as a dense change area, the method further includes:
resetting the pixel anomaly detection result of each pixel point within the dense change area in the first image to 0, and then returning to the step of initializing the position parameter of the detection frame in the first image.
Optionally, the step of performing pixel-level anomaly detection on the first image and the second image to obtain pixel anomaly detection results corresponding to each pixel point in the first image includes:
after a first image and a second image are obtained, taking one of the first image and the second image as a reference image and the other as an image to be transformed;
extracting feature matching point pairs in the reference image and the image to be transformed;
calculating an affine transformation matrix of the image to be transformed relative to the reference image according to the feature matching point pairs;
transforming the image to be transformed according to the affine transformation matrix to obtain a registration image;
and carrying out pixel-level anomaly detection on the reference image and the registration image to obtain pixel anomaly detection results corresponding to all pixel points in the first image.
Optionally, after the step of using the dense change region as a region anomaly detection result between the first image and the second image, the method further includes:
and labeling the dense change area in the first image, and outputting the labeled first image.
In order to achieve the above object, the present invention also provides an image abnormality detection apparatus including:
the detection module is used for performing pixel-level anomaly detection on a first image and a second image to obtain pixel anomaly detection results corresponding to the respective pixel points in the first image, wherein the pixel anomaly detection result represents whether the corresponding pixel point in the first image has changed relative to the pixel point at the same position in the second image;
the searching module is used for searching the first image for a dense change area whose change density meets a preset density condition, wherein the change density of an area is calculated according to the pixel anomaly detection results;
and the determining module is used for taking the dense change area as a region anomaly detection result between the first image and the second image.
In order to achieve the above object, the present invention also provides an image anomaly detection device, including: a memory, a processor, and an image anomaly detection program stored on the memory and executable on the processor, wherein the image anomaly detection program, when executed by the processor, implements the steps of the image anomaly detection method described above.
Further, to achieve the above object, the present invention also proposes a computer readable storage medium having stored thereon an image abnormality detection program which, when executed by a processor, implements the steps of the image abnormality detection method as described above.
In the invention, pixel-level anomaly detection is performed on a first image and a second image to obtain pixel anomaly detection results corresponding to the respective pixel points in the first image, wherein the pixel anomaly detection result represents whether the corresponding pixel point in the first image has changed relative to the pixel point at the same position in the second image; a dense change area whose change density meets a preset density condition is searched for in the first image, the change density of an area being calculated according to the pixel anomaly detection results; and the dense change area is taken as the region anomaly detection result between the first image and the second image. According to the invention, on the basis of the pixel-level anomaly detection results, a dense change area in the image whose change density meets the preset density condition is further searched for, so that an area in which changes are concentrated can be found and used as the anomaly detection result, while the scattered, non-concentrated, small changed areas caused by noise in the pixel-level anomaly detection results are filtered out. This improves the robustness of the image anomaly detection result to noise, and thereby improves the reference value of the image anomaly detection result.
Drawings
FIG. 1 is a schematic diagram of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a method for detecting image anomalies according to a first embodiment of the present invention;
FIG. 3 is a diagram illustrating a dense change area marked on a pixel-level label graph according to an embodiment of the present invention;
FIG. 4 is a functional block diagram of an image anomaly detection apparatus according to a preferred embodiment of the present invention.
The implementation, functional features and advantages of the present invention will be further described with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, fig. 1 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present invention.
It should be noted that, in the embodiment of the present invention, the image anomaly detection device may be a video capture device, a smart phone, a personal computer, a server, or the like, and is not limited herein.
As shown in fig. 1, the image abnormality detection apparatus may include: a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, a communication bus 1002. Wherein a communication bus 1002 is used to enable connective communication between these components. The user interface 1003 may include a Display (Display), an input unit such as a Keyboard (Keyboard), and the optional user interface 1003 may also include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the device configuration shown in fig. 1 does not constitute a limitation of the image abnormality detection device, and may include more or less components than those shown, or some components in combination, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and an image abnormality detection program. The operating system is a program that manages and controls the hardware and software resources of the device, supporting the execution of the image anomaly detection program and other software or programs. In the device shown in fig. 1, the user interface 1003 is mainly used for data communication with a client; the network interface 1004 is mainly used for establishing communication connection with a server; and the processor 1001 may be configured to call the image anomaly detection program stored in the memory 1005 and perform the following operations:
performing pixel-level anomaly detection on a first image and a second image to obtain pixel anomaly detection results corresponding to all pixel points in the first image, wherein the pixel anomaly detection results represent whether the corresponding pixel points in the first image change relative to the pixel points at the same position in the second image;
searching a dense change area with the change density meeting a preset density condition in the first image, wherein the change density of the area is calculated according to the pixel abnormity detection result;
and taking the dense change area as an area abnormity detection result between the first image and the second image.
Further, the operation of searching for the dense variation region with the variation density meeting the preset density condition in the first image comprises:
initializing a position parameter of a detection frame in the first image;
calculating a gradient value of an objective function relative to the position parameter according to the pixel anomaly detection result, wherein the objective function is a function constructed by taking the maximum change intensity of the framed area of the detection frame in the first image as a target;
updating the position parameter according to the gradient value, and detecting whether the updated position parameter meets a preset convergence condition;
if so, taking the area framed by the detection frame in the first image after the position parameter is updated as a dense change area;
and if not, returning to execute the operation of calculating the gradient value of the target function relative to the position parameter according to the pixel abnormity detection result based on the updated position parameter.
Further, the detection frame is a rectangular frame, and the operation of initializing the position parameter of the detection frame in the first image includes:
initializing position parameters of the detection frame in the first image, wherein the position parameters comprise a first size transformation parameter, a second size transformation parameter and an abscissa and an ordinate of a vertex of a lower left corner of the detection frame in the first image, and the first size transformation parameter and the second size transformation parameter are respectively used for transforming the width and the height of the first image to obtain the size of the detection frame.
Further, the value range of the pixel anomaly detection result is a first numerical value and a second numerical value, the first numerical value is a positive number, the second numerical value is a negative number, the absolute value of the first numerical value is the same as the absolute value of the second numerical value, the first numerical value represents that the corresponding pixel point in the first image has no change relative to the pixel point at the same position in the second image, and the second numerical value represents that the corresponding pixel point in the first image has a change relative to the pixel point at the same position in the second image;
the operation of calculating a gradient value of an objective function with respect to the position parameter according to the pixel abnormality detection result includes:
calculating the sum of the pixel abnormity detection results of all pixel points of which the vertical coordinates are the vertical coordinates of the top point at the lower left corner of the detection frame in the first image to obtain a first calculation result;
multiplying the height of the first image by the second size conversion parameter, adding the vertical coordinate of the vertex at the lower left corner of the detection frame to obtain the vertical coordinate of the vertex at the upper right corner of the detection frame, and calculating the sum of the pixel abnormity detection results of all pixels of which the vertical coordinate is the vertical coordinate of the vertex at the upper right corner of the detection frame in the second image to obtain a second calculation result;
subtracting the second calculation result from the first calculation result to obtain a third calculation result, taking the third calculation result as a gradient value of the target function relative to the abscissa of the vertex at the lower left corner of the detection frame, and taking a result obtained by multiplying the third calculation result by the width of the first image as a gradient value of the target function relative to the first size transformation parameter;
calculating the sum of the pixel abnormity detection results of all pixel points of which the abscissa in the first image is the abscissa of the vertex at the lower left corner of the detection frame to obtain a fourth calculation result;
multiplying the width of the first image by the first size conversion parameter, adding the abscissa of the vertex at the lower left corner of the detection frame to obtain the abscissa of the vertex at the upper right corner of the detection frame, and calculating the sum of the pixel abnormality detection results of all pixels of which the abscissa in the second image is the abscissa of the vertex at the upper right corner of the detection frame to obtain a fifth calculation result;
and subtracting the fifth calculation result from the fourth calculation result to obtain a sixth calculation result, taking the sixth calculation result as a gradient value of the target function relative to the ordinate of the vertex at the lower left corner of the detection frame, and taking a result obtained by multiplying the sixth calculation result by the height of the first image as a gradient value of the target function relative to the second size transformation parameter.
Further, after the operation of taking the area framed by the detection frame in the first image after the position parameter is updated as a dense change area, the processor 1001 may be further configured to call the image anomaly detection program stored in the memory 1005 and perform the following operations:
resetting the pixel anomaly detection result of each pixel point within the dense change area in the first image to 0, and then returning to the operation of initializing the position parameter of the detection frame in the first image.
Further, the operation of performing pixel-level anomaly detection on the first image and the second image to obtain pixel anomaly detection results corresponding to each pixel point in the first image includes:
after a first image and a second image are obtained, taking one of the first image and the second image as a reference image and taking the other image as an image to be transformed;
extracting feature matching point pairs in the reference image and the image to be transformed;
calculating an affine transformation matrix of the image to be transformed relative to the reference image according to the feature matching point pairs;
transforming the image to be transformed according to the affine transformation matrix to obtain a registration image;
and carrying out pixel-level anomaly detection on the reference image and the registration image to obtain pixel anomaly detection results corresponding to all pixel points in the first image.
Further, after the operation of using the dense change area as the area anomaly detection result between the first image and the second image, the processor 1001 may be further configured to call an image anomaly detection program stored in the memory 1005, and perform the following operations:
and labeling the dense change area in the first image, and outputting the labeled first image.
Based on the above structure, various embodiments of the image abnormality detection method are proposed.
Referring to fig. 2, fig. 2 is a flowchart illustrating a method for detecting image anomalies according to a first embodiment of the present invention.
While a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from that shown or described herein. In this embodiment, the execution subject of the image anomaly detection method may be a device such as a personal computer, a smart phone or a server, which is not limited in this embodiment. Hereinafter, for convenience of description, the execution subject is omitted when explaining the embodiments. In this embodiment, the image anomaly detection method includes:
step S10, performing pixel-level anomaly detection on a first image and a second image to obtain pixel anomaly detection results corresponding to all pixel points in the first image, wherein the pixel anomaly detection results represent whether the corresponding pixel points in the first image change relative to the pixel points at the same position in the second image;
The two images on which change detection needs to be performed are referred to as a first image and a second image for distinction, where the second image is the reference image of the first image; that is, the areas of the first image that have changed compared to the second image are to be detected.
The manner of acquiring the first image and the second image is not limited in this embodiment, for example, the first image and the second image may be two images captured by the same camera at different time points of the same capture area.
In this embodiment, on the basis of the detection result of the pixel-level anomaly detection, regions with a high change density are further analyzed and used as the final anomaly detection result, instead of treating fragmented pixel blocks, jagged edges or scattered noise points as the anomaly detection result. This improves the robustness of the image anomaly detection result to noise, and thereby improves the reference value of the image anomaly detection result.
The pixel-level anomaly detection can be performed on the first image and the second image to obtain pixel anomaly detection results corresponding to all pixel points in the first image. The pixel-level anomaly detection refers to detection by an anomaly detection method capable of obtaining a pixel-level detection result, where the pixel-level detection result is a result of whether each pixel point in an image changes, and is referred to as a pixel anomaly detection result corresponding to a pixel point in this embodiment. In this embodiment, the detection method used for detecting the pixel level anomaly is not limited, and for example, an IR-MA algorithm may be used for detection.
For each pixel point in the first image, its pixel anomaly detection result represents whether the pixel point has changed relative to the pixel point at the same position in the second image. The position of a pixel point refers to its coordinates in the image, and a pixel point in the first image being at the same position as a pixel point in the second image means that the two pixel points have the same coordinates in their respective images. It is understood that the data form of the pixel anomaly detection result may be defined in advance; for example, 1 may indicate that a pixel point has changed relative to the pixel point at the same position in the reference image and 0 that it has not changed; or, as another example, 1 may indicate that the pixel point has changed and -1 that it has not changed.
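For illustration, here is a minimal sketch of storing such pixel anomaly detection results as a label map, assuming the pixel-level detector has already produced a boolean change mask (the detector itself is not shown, and the function and argument names are illustrative only):

```python
import numpy as np

def encode_labels(change_mask: np.ndarray, use_plus_minus_one: bool = True) -> np.ndarray:
    """Convert a boolean change mask (True = pixel changed) into a label map.

    With use_plus_minus_one=True, changed pixels become +1 and unchanged pixels
    become -1, matching the +1/-1 convention described above; otherwise 1/0.
    """
    if use_plus_minus_one:
        return np.where(change_mask, 1, -1).astype(np.int8)
    return change_mask.astype(np.int8)
```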
In a specific embodiment, before performing pixel-level anomaly detection on the first image and the second image, image registration may also be performed on the first image and the second image, so that the first image and the second image correspond to each other in spatial displacement, angle and size, that is, pixel points at the same position in the first image and the second image also correspond to the same point in an actual spatial region. The manner of image registration is not limited in this embodiment.
In a specific embodiment, before the pixel-level abnormality detection is performed on the first image and the second image, color correction may be performed on the first image and the second image.
Step S20, searching the first image for a dense change area whose change density meets a preset density condition, wherein the change density of an area is calculated according to the pixel anomaly detection results;
After the pixel anomaly detection results corresponding to the respective pixel points in the first image are obtained, an area whose change density meets a preset density condition (hereinafter referred to as a dense change area for distinction) may be searched for in the first image.
The change density of a certain region in the first image may be calculated according to the pixel anomaly detection results of the pixel points in the first image; the specific calculation manner is not limited in this embodiment and may be set as needed. For example, in one embodiment, the change density of a region may be defined so that the more changed pixels and the fewer unchanged pixels there are inside the region, and the fewer changed pixels and the more unchanged pixels there are outside the region, the greater the change density of the region. In that case, the change density may be calculated as follows: subtract the number of unchanged pixels inside the region from the number of changed pixels inside the region to obtain a first difference, subtract the number of unchanged pixels outside the region from the number of changed pixels outside the region to obtain a second difference, and take the first difference minus the second difference as the change density of the region. Alternatively, if the pixel anomaly detection result of a pixel point is set to 1 when the pixel point has changed and to -1 when it has not changed, the change density may be calculated as follows: subtract the sum of the pixel anomaly detection results of all pixel points outside the region from the sum of the pixel anomaly detection results of all pixel points inside the region, and take the result as the change density of the region.
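As a concrete illustration of the second calculation mode above, here is a sketch assuming the +1/-1 label map from the earlier example; the function and variable names are illustrative:

```python
import numpy as np

def change_density(labels: np.ndarray, x0: int, y0: int, x1: int, y1: int) -> int:
    """Change density of the rectangular region [x0, x1) x [y0, y1):
    sum of labels inside the region minus sum of labels outside it,
    with labels in {+1 (changed), -1 (unchanged)}."""
    inside = int(labels[y0:y1, x0:x1].sum())
    outside = int(labels.sum()) - inside
    return inside - outside
```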
The preset density condition may be set as needed, and is not limited in this embodiment, for example, the density of the area needs to be greater than a certain threshold.
It can be understood that, by setting the preset density condition and the calculation manner of the change density, the dense change areas that are found are the areas of the first image in which changes are concentrated, so that the scattered, non-concentrated, small-block changed areas caused by noise in the pixel-level detection result are filtered out.
The shape of the dense change region is not limited in this embodiment, and may be, for example, a rectangle, a pentagon, a hexagon, or the like.
The manner of searching for dense change areas in the first image is not limited in this embodiment. For example, in one embodiment, the search may proceed as follows: initialize a detection frame of a certain shape and size and slide it over the first image in a traversal manner; at each sliding position, calculate the change density of the area framed by the detection frame and detect whether it meets the preset density condition, for example whether it is greater than a certain threshold; if the preset density condition is met, take the area framed by the detection frame as a dense change area. After the traversal, a detection frame of another shape and/or size may be initialized and the traversal search for dense change areas continued. The search stops when the number of dense change areas found reaches a set upper limit, or the number of traversals reaches a set upper limit, or some other set stopping condition is reached.
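A sketch of such a traversal-style search, assuming the +1/-1 label map convention; the window size, stride, threshold, and region cap are illustrative parameters, not values prescribed by this embodiment:

```python
import numpy as np

def traverse_search(labels: np.ndarray, win_w: int, win_h: int,
                    stride: int, density_threshold: float,
                    max_regions: int = 10):
    """Slide a win_w x win_h detection frame over the +1/-1 label map and
    collect regions whose change density (sum inside minus sum outside)
    exceeds the threshold."""
    h, w = labels.shape
    total = int(labels.sum())
    regions = []
    for y0 in range(0, h - win_h + 1, stride):
        for x0 in range(0, w - win_w + 1, stride):
            inside = int(labels[y0:y0 + win_h, x0:x0 + win_w].sum())
            density = inside - (total - inside)
            if density > density_threshold:
                regions.append((x0, y0, x0 + win_w, y0 + win_h))
                if len(regions) >= max_regions:
                    return regions
    return regions
```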
It should be noted that, although the calculation manner of the change density of a region is defined in advance, depending on the chosen search manner the search process may or may not explicitly execute a step that calculates the change density according to that calculation manner; that is, areas whose change density meets the preset density condition may also be found in a way that only computes the change density indirectly.
Step S30, using the dense change region as a region abnormality detection result between the first image and the second image.
After the dense variation region is found, the dense variation region may be used as an abnormality detection result between the first image and the second image (referred to as a region abnormality detection result to distinguish from the above-described pixel abnormality detection result). That is, the densely changing area is an area where the detected first image has changed from the second image.
In a specific embodiment, the area abnormality detection result may be output as needed. For example, output to a display page for presentation, e.g., output to the next algorithm stage in the application scenario, and so on.
Further, in an embodiment, the pixel anomaly detection results of the pixel points in the first image may be converted into an image (referred to as a pixel-level label map), in which changed pixel points are white and unchanged pixel points are black; the dense change areas are marked in the pixel-level label map, and the marked image is output and displayed, so that the user can see the areas in which changes are concentrated as well as the pixel anomaly detection results that may be due to noise. For example, fig. 3 shows an example of a pixel-level label map with dense change areas marked, where the areas framed by gray boxes are the dense change areas.
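A minimal sketch of such a visualization using OpenCV, assuming the +1/-1 label map and a list of dense change areas given as (x0, y0, x1, y1) boxes; the gray color and line width are illustrative choices:

```python
import numpy as np
import cv2

def draw_label_map(labels: np.ndarray, regions) -> np.ndarray:
    """Render changed pixels as white, unchanged as black, and frame each
    dense change area with a gray rectangle."""
    vis = np.where(labels > 0, 255, 0).astype(np.uint8)
    vis = cv2.cvtColor(vis, cv2.COLOR_GRAY2BGR)
    for (x0, y0, x1, y1) in regions:
        cv2.rectangle(vis, (x0, y0), (x1, y1), (128, 128, 128), 2)
    return vis
```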
Further, in an embodiment, after obtaining the dense change region, the dense change region may be labeled in the first image, and the labeled first image is output, so that the user may intuitively know the region of the first image where the change is dense without being affected by the fine noise.
In this embodiment, pixel-level anomaly detection is performed on a first image and a second image to obtain pixel anomaly detection results corresponding to the respective pixel points in the first image, wherein the pixel anomaly detection result represents whether the corresponding pixel point in the first image has changed relative to the pixel point at the same position in the second image; a dense change area whose change density meets a preset density condition is searched for in the first image, the change density of an area being calculated according to the pixel anomaly detection results; and the dense change area is taken as the region anomaly detection result between the first image and the second image. In this embodiment, on the basis of the pixel-level anomaly detection results, a dense change area whose change density meets the preset density condition is further searched for in the image, so that an area in which changes are concentrated can be found and used as the anomaly detection result, while the scattered, non-concentrated, small changed areas caused by noise in the pixel-level anomaly detection results are filtered out; this improves the robustness of the image anomaly detection result to noise, and thereby improves the reference value of the image anomaly detection result.
Based on the first embodiment described above, the image abnormality detection method according to the second embodiment of the present invention is proposed in this embodiment, where the step S20 includes:
step S201, initializing a position parameter of a detection frame in the first image;
in this embodiment, a method of optimizing an objective function is provided to search for a dense change area, and compared with a traversal search method, accuracy and efficiency of search can be improved.
Specifically, according to the definition of the change density of a region, an objective function may be constructed with the goal of maximizing the change density of the area framed by the detection frame in the first image, with the position parameter of the detection frame in the first image as the parameter to be optimized; that is, the change density of the area framed by the detection frame is maximized by finding the optimal position parameter. It should be noted that, although the objective function is constructed with the goal of maximizing the change density of the area framed by the detection frame in the first image, in order to find the optimal parameter the maximization goal may be converted, through the way the objective function is set, into minimizing the function value of the objective function; for example, the objective function may be set to the negative of the change density of the area framed by the detection frame.
The position parameter of the detection frame is a parameter that defines the position of the detection frame in the first image; which specific parameters it includes may be set as needed and is not limited in this embodiment. For example, when the detection frame is defined as a rectangular frame, its position parameter may be the coordinates of its lower-left corner vertex and upper-right corner vertex in the first image, or it may be set as the coordinates of the lower-left corner vertex of the detection frame in the first image together with scaling coefficients of the height and width of the detection frame relative to the height and width of the first image.
In this embodiment, no limitation is imposed on which specific function is adopted as the objective function. For example, in an embodiment, the objective function may be set to a first difference minus a second difference, where the first difference is the result of subtracting the number of changed pixel points from the number of unchanged pixel points in the first image, and the second difference is the result of subtracting the number of changed pixel points from the number of unchanged pixel points in the area framed by the detection frame; in this case, the region with the greatest change density is found by searching for the position parameter that minimizes the objective function.
According to the constructed objective function, at least one round of optimization may be performed on the position parameter of the detection frame.
Before the first round of optimization, the position parameter of the detection frame in the first image may be initialized. The initialization may be random, or to a preset default value, which is not limited in this embodiment.
Step S202, calculating a gradient value of the objective function with respect to the position parameter according to the pixel anomaly detection results, wherein the objective function is constructed with the goal of maximizing the change density of the area framed by the detection frame in the first image;
it should be noted that, after the objective function is constructed, a calculation formula of the gradient value of the objective function relative to the position parameter of the detection frame may be derived, and after the position parameter of the detection frame is initialized, the pixel abnormality detection result of each pixel point in the first image and the initialized position parameter are substituted into the calculation formula, so as to calculate the gradient value of the objective function relative to the position parameter in the current round of optimization. It should be noted that, when there are multiple position parameters, gradient values of the objective function with respect to the position parameters are calculated respectively.
When the objective functions are different, the derived calculation formulas of the gradient values are also different, and the calculation formulas of the gradient values are not limited in this embodiment.
Step S203, updating the position parameter according to the gradient value, and detecting whether the updated position parameter meets a preset convergence condition;
After the gradient value corresponding to the position parameter is obtained through calculation, the position parameter can be updated according to the gradient value. The manner of updating the position parameter according to the gradient value is not limited in this embodiment; for example, the position parameter may be updated by a gradient descent algorithm. For example, if the position parameter is denoted by P and the corresponding gradient value by G_P, P can be updated as follows:
P ← P - ε * G_P
where ε is the preset optimization step size.
After updating the position parameter, it may be detected whether the updated position parameter meets a preset convergence condition. The preset convergence condition may be set in advance as needed, for example, the change amount of the updated position parameter compared with the position parameter before updating may be set to be smaller than a set threshold.
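A sketch of this update and convergence check, assuming the gradient values for the four position parameters have already been computed; the step size and tolerance are illustrative:

```python
import numpy as np

def update_position(p: np.ndarray, grad: np.ndarray,
                    step: float = 1e-3, tol: float = 1e-4):
    """One gradient-descent update P <- P - eps * G_P, followed by a
    convergence test on the magnitude of the parameter change."""
    p_new = p - step * grad
    converged = np.max(np.abs(p_new - p)) < tol
    return p_new, converged
```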
Step S204, if so, taking the area framed by the detection frame in the first image after the position parameter is updated as a dense change area;
if it is detected that the updated position parameter meets the preset convergence condition, an area framed by the detection frame after updating the position parameter in the first image may be used as the dense change area.
Step S205, if not, returning to execute the step S202 based on the updated location parameter.
If it is detected that the updated position parameter does not meet the preset convergence condition, this indicates that the change density of the area framed in the first image by the detection frame with the updated position parameter has not yet been maximized; in this case, the method may return to step S202 based on the updated position parameter, that is, the next round of optimization is performed. It can be understood that, in the next round of optimization, it is not the initialized position parameter that is substituted into the calculation formula of the gradient value, but the position parameter updated in the current round; similarly, in the i-th round of optimization, the position parameter updated in the (i-1)-th round is substituted into the calculation formula to calculate the gradient value.
Further, in an embodiment, the step S201 includes:
step S2011, initializing position parameters of the detection frame in the first image, where the position parameters include a first size transformation parameter, a second size transformation parameter, and an abscissa and an ordinate of a vertex of a lower left corner of the detection frame in the first image, and the first size transformation parameter and the second size transformation parameter are respectively used to transform the width and the height of the first image to obtain the size of the detection frame.
In this embodiment, to further improve the efficiency of searching for dense change areas, the detection frame may be set to be a rectangular frame, and the position parameter may be set to include the first size transformation parameter, the second size transformation parameter, and the abscissa and ordinate of the lower-left corner vertex of the detection frame in the first image. When determining the area framed by the detection frame in the first image, the width of the detection frame is obtained by multiplying the width of the first image by the first size transformation parameter, and the height of the detection frame is obtained by multiplying the height of the first image by the second size transformation parameter.
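A sketch of how the framed area follows from these position parameters, with p1 and p2 the size transformation parameters, (x0, y0) the lower-left corner vertex, and c and r the width and height of the first image; the function name is illustrative:

```python
def frame_from_params(p1: float, p2: float, x0: float, y0: float,
                      c: int, r: int):
    """Return the detection frame as (x0, y0, x1, y1): its width is c * p1
    and its height is r * p2, anchored at the lower-left corner vertex."""
    return x0, y0, x0 + c * p1, y0 + r * p2
```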
Further, under the setting that the detection frame is a rectangular frame, in an embodiment, a construction method of an objective function and a calculation formula of gradient values of the objective function relative to position parameters are provided.
Specifically, the value range of the pixel anomaly detection result may be set to a first value and a second value, that is, for each pixel point in the first image, the pixel anomaly detection result of the pixel point is either the first value or the second value. The first value is a positive number, the second value is a negative number, and the absolute value of the first value is the same as that of the second value. When the pixel anomaly detection result of a pixel point in the first image is the first value, it indicates that the pixel point has not changed relative to the pixel point at the same position in the second image; when it is the second value, it indicates that the pixel point has changed relative to the pixel point at the same position in the second image.
The change density of a region may be defined as follows: subtract the sum of the pixel anomaly detection results of all pixel points outside the region from the sum of the pixel anomaly detection results of all pixel points inside the region, and take the result as the change density of the region. According to this definition, the objective function F(p) can be constructed as:
F(p) = Σ_{x=0..c} Σ_{y=0..r} I(x, y) - 2 * ∫_{u=p4}^{r*p2+p4} ∫_{t=p3}^{c*p1+p3} I(t, u) dt du
wherein p = [p1, p2, p3, p4], where p1 is the first size transformation parameter, p2 is the second size transformation parameter, and p3 and p4 are respectively the abscissa and the ordinate of the lower-left corner vertex of the detection frame in the first image. W(x; p) denotes the coordinates of the point x = [x, y] after transformation by the parameter p; written out in detail:
W(x; p) = [p1*x + p3, p2*y + p4]
I(x, y) denotes the pixel anomaly detection result of the pixel point with coordinates (x, y) in the first image, and c and r denote the width and height of the first image, respectively. The integration variables t and u correspond to x and y, respectively.
The term on the left of the minus sign in the objective function is the sum of the pixel anomaly detection results over all pixels of the whole first image, and the term on the right is the sum of the pixel anomaly detection results over the pixels inside the detection frame. The factor of 2 arises because the left term is taken over the whole image, whereas according to the definition of the change density of a region it should only be taken over the area outside the frame, so the in-frame sum has to be subtracted one extra time.
This objective function is a double integral with variable limits. Since the term on the left of the minus sign is a fixed value, i.e. a constant, it can be ignored when taking derivatives, so the objective function is equivalent to:
F(p) ≡ -2 * ∫_{u=p4}^{r*p2+p4} ∫_{t=p3}^{c*p1+p3} I(t, u) dt du
In the calculation, the inner and outer integrals of the double integral involve different parameters, so, taking the x direction as an example, the objective function can be treated directly as:
[single-variable form of the objective function; rendered as an image in the original and not reproduced here]
the partial derivative can be calculated for each coefficient to obtain:
[partial-derivative expressions; rendered as an image in the original and not reproduced here]
where ∂ denotes the partial derivative operator.
By introducing the row integral I_y, where for example I_y(0) denotes the sum of I(0, 0) through I(c, 0), the double integral can be reduced to an equivalent one-fold integral, changing F(p) from a double-integral function into a single integral and greatly simplifying the optimization function. Thus, in the x direction, the derivative of the one-fold variable-limit integral with respect to p1 and p2 (denoted p_i) is:
[derivative formula and auxiliary definition; rendered as images in the original and not reproduced here]
The Jacobian matrix with respect to the parameter p is:
[Jacobian matrix; rendered as an image in the original and not reproduced here]
The derivatives in the y direction with respect to the parameters p3 and p4 (denoted p_i) can be derived similarly:
[derivative formula; rendered as an image in the original and not reproduced here]
According to the above calculation formulas of the gradient values with respect to the parameters p1, p2, p3 and p4, the step S202 may specifically include:
calculating the sum of the pixel anomaly detection results of all pixel points in the first image whose ordinate is the ordinate of the lower-left corner vertex of the detection frame, to obtain a first calculation result;
multiplying the height of the first image by the second size transformation parameter and adding the ordinate of the lower-left corner vertex of the detection frame to obtain the ordinate of the upper-right corner vertex of the detection frame, and calculating the sum of the pixel anomaly detection results of all pixel points in the second image whose ordinate is the ordinate of the upper-right corner vertex of the detection frame, to obtain a second calculation result;
subtracting the second calculation result from the first calculation result to obtain a third calculation result, taking the third calculation result as the gradient value of the objective function with respect to the abscissa of the lower-left corner vertex of the detection frame, and taking the result of multiplying the third calculation result by the width of the first image as the gradient value of the objective function with respect to the first size transformation parameter;
calculating the sum of the pixel anomaly detection results of all pixel points in the first image whose abscissa is the abscissa of the lower-left corner vertex of the detection frame, to obtain a fourth calculation result;
multiplying the width of the first image by the first size transformation parameter and adding the abscissa of the lower-left corner vertex of the detection frame to obtain the abscissa of the upper-right corner vertex of the detection frame, and calculating the sum of the pixel anomaly detection results of all pixel points in the second image whose abscissa is the abscissa of the upper-right corner vertex of the detection frame, to obtain a fifth calculation result;
and subtracting the fifth calculation result from the fourth calculation result to obtain a sixth calculation result, taking the sixth calculation result as the gradient value of the objective function with respect to the ordinate of the lower-left corner vertex of the detection frame, and taking the result of multiplying the sixth calculation result by the height of the first image as the gradient value of the objective function with respect to the second size transformation parameter.
Further, in an embodiment, after the step S204, the method further includes:
step S206, after the pixel anomaly detection result of each pixel point in the dense change area in the first image is reset to 0, the step S201 is executed again.
After a dense change area is found, the pixel anomaly detection result of each pixel point within that dense change area in the first image may be reset to 0, and then step S201 is executed again, that is, another dense change area is searched for. Since the pixel anomaly detection results of the pixel points in the dense change areas already found have been reset to 0, the newly found dense change area differs from those found previously; in this way, multiple areas of concentrated change in the first image can be found.
In particular embodiments, a condition for ending the loop may be set, for example, stopping the search after a set number of densely changed regions are found.
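A sketch of this outer loop, assuming a `find_dense_region` routine that implements the gradient-based search of steps S201 through S205 (not shown here) and returns one region as integer pixel bounds (x0, y0, x1, y1); the stop count is illustrative:

```python
import numpy as np

def find_multiple_regions(labels: np.ndarray, find_dense_region, max_regions: int = 5):
    """Repeatedly search for a dense change area, zero out its pixel anomaly
    detection results, and search again, so each round finds a different area."""
    regions = []
    for _ in range(max_regions):
        x0, y0, x1, y1 = find_dense_region(labels)
        regions.append((x0, y0, x1, y1))
        labels[y0:y1, x0:x1] = 0  # reset results inside the found region to 0
    return regions
```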
Further, based on the first embodiment and/or the second embodiment, a third embodiment of the image abnormality detection method of the present invention is provided, and in this embodiment, the step S10 includes:
step S101, after a first image and a second image are obtained, one of the first image and the second image is used as a reference image, and the other image is used as an image to be transformed;
After the first image and the second image are acquired, image registration may be performed on them. In this embodiment, one of the first image and the second image may be used as the reference image and the other as the image to be transformed. It should be noted that, in a specific application scenario, whether the first image or the second image is used as the reference image may be decided as needed, which is not limited in this embodiment.
Step S102, extracting feature matching point pairs in the reference image and the image to be transformed;
Feature matching point pairs are extracted from the reference image and the image to be transformed. The number of feature matching point pairs may be three, and each pair consists of one pixel point in the reference image and one pixel point in the image to be transformed. The extraction method of the feature matching point pairs is not limited in this embodiment.
For example, in an embodiment, feature point detection may be performed on the reference image and the image to be transformed respectively to obtain the feature vectors corresponding to the feature points. The feature point detection method is not limited here; for example, Scale-Invariant Feature Transform (SIFT) feature points may be detected. For each feature point in the reference image, 2-nearest-neighbour matching is then performed in the image to be transformed by brute-force matching, that is, the two feature points most similar to it (smallest Euclidean distance between feature vectors) are selected from the image to be transformed. For each feature point in the reference image, let d1 and d2 be the Euclidean distances between its feature vector and those of its two most similar feature points, and set a threshold tau; if d1/d2 is less than tau, the feature point and its most similar feature point are taken as a pair of feature matching points.
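As an illustration, this matching step could look as follows with OpenCV; SIFT availability depends on the OpenCV build, and the default tau = 0.75 is an illustrative choice rather than a value taken from the embodiment.

```python
import cv2

def match_feature_points(reference, to_transform, tau=0.75):
    """SIFT detection plus brute-force 2-nearest-neighbour matching with the
    d1/d2 < tau ratio test described above. Function name and tau default
    are illustrative assumptions."""
    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(reference, None)
    kp_tgt, des_tgt = sift.detectAndCompute(to_transform, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)            # Euclidean distance between feature vectors
    knn = matcher.knnMatch(des_ref, des_tgt, k=2)   # two most similar points per reference feature

    pairs = []
    for candidates in knn:
        if len(candidates) < 2 or candidates[1].distance == 0:
            continue
        d1, d2 = candidates[0].distance, candidates[1].distance
        if d1 / d2 < tau:                           # keep only unambiguous matches
            pairs.append((kp_ref[candidates[0].queryIdx].pt,
                          kp_tgt[candidates[0].trainIdx].pt))
    return pairs
```

knnMatch with k = 2 returns, for each reference descriptor, its two nearest neighbours in the image to be transformed, which is exactly what the d1/d2 ratio test needs.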
Step S103, calculating an affine transformation matrix of the image to be transformed relative to the reference image according to the feature matching point pairs;
step S104, transforming the image to be transformed according to the affine transformation matrix to obtain a registration image;
After the feature matching point pairs are extracted, an affine transformation matrix of the image to be transformed relative to the reference image may be calculated from the feature matching point pairs, and the image to be transformed is then transformed according to this matrix; for clarity, the transformed image is hereinafter referred to as the registration image. An affine transformation is a linear transformation (multiplication by a matrix) followed by a translation (addition of a vector) that maps one vector space onto another. In this embodiment, neither the way the affine transformation matrix is calculated nor the way the affine transformation is applied to the image to be transformed is limited.
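As one possible realisation with OpenCV, the sketch below derives the 2×3 affine matrix from the matched point pairs produced by the previous sketch and warps the image to be transformed into the reference frame; the function names and the fallback to a robust fit when more than three pairs are available are assumptions, not requirements of the embodiment.

```python
import cv2
import numpy as np

def register_image(to_transform, reference, pairs):
    """Estimate the 2x3 affine transformation matrix from matched point
    pairs and warp the image to be transformed into the reference frame.
    Names are illustrative."""
    src = np.float32([p[1] for p in pairs])     # points in the image to be transformed
    dst = np.float32([p[0] for p in pairs])     # corresponding points in the reference image

    if len(pairs) == 3:
        M = cv2.getAffineTransform(src, dst)    # exact solution from three point pairs
    else:
        M, _ = cv2.estimateAffine2D(src, dst)   # robust fit when more pairs are available

    h, w = reference.shape[:2]
    return cv2.warpAffine(to_transform, M, (w, h))  # registration image in the reference frame
```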
Step S105, pixel-level anomaly detection is carried out on the reference image and the registration image, and pixel anomaly detection results corresponding to all pixel points in the first image are obtained.
After the reference image and the registration image are obtained, pixel-level anomaly detection may be performed on them to obtain the pixel anomaly detection results corresponding to the pixel points in the first image. It should be noted that, when the first image serves as the image to be transformed, the first image referred to in the subsequent steps is the transformed first image; that is, obtaining the pixel anomaly detection result corresponding to each pixel point in the first image specifically means obtaining the pixel anomaly detection result corresponding to each pixel point in the transformed first image.
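The embodiment leaves the concrete pixel-level detector open. Purely as a placeholder, the sketch below thresholds an absolute difference and maps it to the +1 (unchanged) / -1 (changed) convention introduced later for the first and second numerical values; both the differencing rule and the threshold are assumptions, not part of the described method.

```python
import cv2
import numpy as np

def pixel_anomaly_map(reference, registered, threshold=30):
    """Placeholder pixel-level anomaly detector: thresholded absolute
    difference mapped to +1 (no change) / -1 (changed). The differencing
    rule and the threshold value are illustrative assumptions."""
    diff = cv2.absdiff(reference, registered)
    if diff.ndim == 3:
        diff = diff.max(axis=2)                 # a change in any channel counts as a change
    return np.where(diff > threshold, -1, 1).astype(np.int8)
```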
In addition, an embodiment of the present invention further provides an image anomaly detection apparatus, and with reference to fig. 4, the image anomaly detection apparatus includes:
the detection module 10 is configured to perform pixel-level anomaly detection on a first image and a second image to obtain pixel anomaly detection results corresponding to each pixel point in the first image, where the pixel anomaly detection results represent whether the corresponding pixel point in the first image changes relative to a pixel point at the same position in the second image;
a searching module 20, configured to search a dense variation region in which the variation density meets a preset density condition in the first image, where the variation density of the region is calculated according to the pixel anomaly detection result;
a determining module 30, configured to use the dense change area as an area anomaly detection result between the first image and the second image.
Further, the lookup module 20 is further configured to:
initializing a position parameter of a detection frame in the first image;
calculating a gradient value of an objective function relative to the position parameter according to the pixel anomaly detection result, wherein the objective function is a function constructed with the goal of maximizing the change intensity of the area framed by the detection frame in the first image;
updating the position parameters according to the gradient values, and detecting whether the updated position parameters meet a preset convergence condition;
if so, taking the area framed by the detection frame in the first image after updating the position parameters as a dense change area;
and if not, returning, based on the updated position parameters, to the operation of calculating the gradient value of the objective function relative to the position parameter according to the pixel anomaly detection result.
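A compact sketch of how this update loop could be organised is shown below; the learning rate, convergence tolerance, iteration cap and initialization are all illustrative assumptions, and grad_fn is expected to behave like the box_gradients helper sketched earlier.

```python
def search_dense_region(d, grad_fn, lr=1e-3, tol=1e-4, max_iters=500):
    """Gradient-ascent loop over the detection frame position parameters;
    stops once the update falls below a preset tolerance (the convergence
    condition). All numeric settings are illustrative assumptions."""
    H, W = d.shape
    x0, y0, s1, s2 = W * 0.25, H * 0.25, 0.5, 0.5        # assumed initialization
    for _ in range(max_iters):
        gx0, gy0, gs1, gs2 = grad_fn(d, x0, y0, s1, s2)
        new = (x0 + lr * gx0, y0 + lr * gy0,
               s1 + lr * gs1 / W, s2 + lr * gs2 / H)     # ad hoc per-parameter scaling
        if max(abs(a - b) for a, b in zip(new, (x0, y0, s1, s2))) < tol:
            x0, y0, s1, s2 = new
            break                                        # convergence condition met
        x0, y0, s1, s2 = new
    return int(x0), int(y0), int(x0 + W * s1), int(y0 + H * s2)
```

Plugged into the outer loop sketched earlier with grad_fn set to box_gradients, the returned corner coordinates give the area whose pixel anomaly results are reset to 0 before the next search.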
Further, the detection frame is a rectangular frame, and the search module 20 is further configured to:
initializing position parameters of the detection frame in the first image, wherein the position parameters comprise a first size transformation parameter, a second size transformation parameter and an abscissa and an ordinate of a vertex of a lower left corner of the detection frame in the first image, and the first size transformation parameter and the second size transformation parameter are respectively used for transforming the width and the height of the first image to obtain the size of the detection frame.
Further, the value range of the pixel anomaly detection result is a first numerical value and a second numerical value, the first numerical value is a positive number, the second numerical value is a negative number, the absolute value of the first numerical value is the same as the absolute value of the second numerical value, the first numerical value represents that the corresponding pixel point in the first image has no change relative to the pixel point at the same position in the second image, and the second numerical value represents that the corresponding pixel point in the first image has a change relative to the pixel point at the same position in the second image;
the lookup module 20 is further configured to:
calculating the sum of the pixel anomaly detection results of all pixel points whose ordinate in the first image is the ordinate of the vertex at the lower left corner of the detection frame, to obtain a first calculation result;
multiplying the height of the first image by the second size transformation parameter and adding the ordinate of the vertex at the lower left corner of the detection frame to obtain the ordinate of the vertex at the upper right corner of the detection frame, and calculating the sum of the pixel anomaly detection results of all pixel points whose ordinate in the second image is the ordinate of the vertex at the upper right corner of the detection frame, to obtain a second calculation result;
subtracting the second calculation result from the first calculation result to obtain a third calculation result, taking the third calculation result as a gradient value of the objective function relative to the abscissa of the vertex at the lower left corner of the detection frame, and taking the result obtained by multiplying the third calculation result by the width of the first image as a gradient value of the objective function relative to the first size transformation parameter;
calculating the sum of the pixel anomaly detection results of all pixel points whose abscissa in the first image is the abscissa of the vertex at the lower left corner of the detection frame, to obtain a fourth calculation result;
multiplying the width of the first image by the first size transformation parameter and adding the abscissa of the vertex at the lower left corner of the detection frame to obtain the abscissa of the vertex at the upper right corner of the detection frame, and calculating the sum of the pixel anomaly detection results of all pixel points whose abscissa in the second image is the abscissa of the vertex at the upper right corner of the detection frame, to obtain a fifth calculation result;
and subtracting the fifth calculation result from the fourth calculation result to obtain a sixth calculation result, taking the sixth calculation result as a gradient value of the objective function relative to the ordinate of the vertex at the lower left corner of the detection frame, and taking the result obtained by multiplying the sixth calculation result by the height of the first image as a gradient value of the objective function relative to the second size transformation parameter.
Further, the lookup module 20 is further configured to:
and after the pixel anomaly detection result of each pixel point in the dense change area in the first image is reset to 0, returning to execute the operation of initializing the position parameter of the detection frame in the first image.
Further, the detection module 10 is further configured to:
after a first image and a second image are obtained, taking one of the first image and the second image as a reference image and taking the other image as an image to be transformed;
extracting feature matching point pairs in the reference image and the image to be transformed;
calculating an affine transformation matrix of the image to be transformed relative to the reference image according to the feature matching point pairs;
transforming the image to be transformed according to the affine transformation matrix to obtain a registration image;
and carrying out pixel-level anomaly detection on the reference image and the registration image to obtain pixel anomaly detection results corresponding to all pixel points in the first image.
Further, the image abnormality detection apparatus further includes:
and the output module is used for labeling the dense change area in the first image and outputting the labeled first image.
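A minimal sketch of such an output module's drawing step is given below; the rectangle style and colour are illustrative choices, not part of the embodiment.

```python
import cv2

def label_regions(first_image, regions, color=(0, 0, 255)):
    """Draw each dense change area onto a copy of the first image and
    return the labelled copy. Drawing style and colour are illustrative."""
    out = first_image.copy()
    for x0, y0, x1, y1 in regions:
        cv2.rectangle(out, (int(x0), int(y0)), (int(x1), int(y1)), color, 2)
    return out
```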
The specific implementation of the image anomaly detection apparatus of the present invention is basically the same as the embodiments of the image anomaly detection method, and is not described herein again.
Furthermore, an embodiment of the present invention further provides a computer-readable storage medium on which an image anomaly detection program is stored; when executed by a processor, the image anomaly detection program implements the steps of the image anomaly detection method described above.
The embodiments of the image anomaly detection device and the computer-readable storage medium of the present invention can refer to the embodiments of the image anomaly detection method of the present invention, and are not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional like elements in a process, method, article, or apparatus comprising the element.
The above-mentioned serial numbers of the embodiments of the present invention are only for description, and do not represent the advantages and disadvantages of the embodiments.
Through the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments can be implemented by software together with a necessary general-purpose hardware platform, or alternatively by hardware alone, but in many cases the former is the preferable implementation. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and including instructions for causing a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description covers only preferred embodiments of the present invention and is not intended to limit its scope; all equivalent structures or equivalent process transformations derived from the present invention, whether applied directly or indirectly in other related technical fields, are likewise included in the scope of protection of the present invention.

Claims (7)

1. An image abnormality detection method characterized by comprising the steps of:
performing pixel-level anomaly detection on a first image and a second image to obtain pixel anomaly detection results corresponding to all pixel points in the first image, wherein the pixel anomaly detection results represent whether the corresponding pixel points in the first image change relative to the pixel points at the same position in the second image;
searching a dense change area with the change density meeting a preset density condition in the first image, wherein the change density of the area is calculated according to the pixel abnormity detection result;
taking the dense change region as a region abnormality detection result between the first image and the second image;
the step of searching for the dense variation region with the variation density meeting the preset density condition in the first image comprises the following steps:
initializing a position parameter of a detection frame in the first image;
calculating a gradient value of an objective function relative to the position parameter according to the pixel abnormity detection result, wherein the objective function is a function constructed by taking the maximum change intensity of the framed area of the detection frame in the first image as a target;
updating the position parameter according to the gradient value, and detecting whether the updated position parameter meets a preset convergence condition;
if so, taking the area framed by the detection frame in the first image after updating the position parameters as a dense change area;
if not, returning to execute the step of calculating the gradient value of the objective function relative to the position parameter according to the pixel abnormity detection result based on the updated position parameter;
the detection frame is a rectangular frame, and the step of initializing the position parameters of the detection frame in the first image comprises the following steps:
initializing position parameters of the detection frame in the first image, wherein the position parameters comprise a first size transformation parameter, a second size transformation parameter and an abscissa and an ordinate of a vertex of a lower left corner of the detection frame in the first image, and the first size transformation parameter and the second size transformation parameter are respectively used for transforming the width and the height of the first image to obtain the size of the detection frame;
the value range of the pixel abnormity detection result is a first value and a second value, the first value is a positive number, the second value is a negative number, the absolute value of the first value is the same as the absolute value of the second value, the first value indicates that the corresponding pixel point in the first image has no change relative to the pixel point at the same position in the second image, and the second value indicates that the corresponding pixel point in the first image has a change relative to the pixel point at the same position in the second image;
the step of calculating a gradient value of an objective function with respect to the position parameter according to the pixel abnormality detection result includes:
calculating the sum of the pixel abnormity detection results of all pixel points of which the ordinate in the first image is the ordinate of the vertex at the lower left corner of the detection frame, to obtain a first calculation result;
multiplying the height of the first image by the second size transformation parameter and adding the ordinate of the vertex at the lower left corner of the detection frame to obtain the ordinate of the vertex at the upper right corner of the detection frame, and calculating the sum of the pixel abnormity detection results of all pixel points of which the ordinate in the second image is the ordinate of the vertex at the upper right corner of the detection frame, to obtain a second calculation result;
subtracting the second calculation result from the first calculation result to obtain a third calculation result, taking the third calculation result as a gradient value of the objective function relative to the abscissa of the vertex at the lower left corner of the detection frame, and taking a result obtained by multiplying the third calculation result by the width of the first image as a gradient value of the objective function relative to the first size transformation parameter;
calculating the sum of the pixel abnormity detection results of all pixel points of which the abscissa in the first image is the abscissa of the vertex at the lower left corner of the detection frame to obtain a fourth calculation result;
multiplying the width of the first image by the first size transformation parameter and adding the abscissa of the vertex at the lower left corner of the detection frame to obtain the abscissa of the vertex at the upper right corner of the detection frame, and calculating the sum of the pixel abnormality detection results of all pixel points of which the abscissa in the second image is the abscissa of the vertex at the upper right corner of the detection frame to obtain a fifth calculation result;
and subtracting the fifth calculation result from the fourth calculation result to obtain a sixth calculation result, taking the sixth calculation result as a gradient value of the objective function relative to the ordinate of the vertex at the lower left corner of the detection frame, and taking a result obtained by multiplying the sixth calculation result by the height of the first image as a gradient value of the objective function relative to the second size transformation parameter.
2. The image abnormality detection method according to claim 1, wherein after the step of taking the area framed by the detection frame in the first image after updating the position parameters as a dense change area, the method further comprises:
after the pixel abnormity detection result of each pixel point in the dense change area in the first image is reset to 0, returning to execute the step of initializing the position parameter of the detection frame in the first image.
3. The image anomaly detection method according to claim 1, wherein the step of performing pixel-level anomaly detection on the first image and the second image to obtain pixel anomaly detection results corresponding to respective pixel points in the first image comprises:
after a first image and a second image are obtained, taking one of the first image and the second image as a reference image and taking the other image as an image to be transformed;
extracting feature matching point pairs in the reference image and the image to be transformed;
calculating an affine transformation matrix of the image to be transformed relative to the reference image according to the feature matching point pairs;
transforming the image to be transformed according to the affine transformation matrix to obtain a registration image;
and performing pixel-level anomaly detection on the reference image and the registration image to obtain pixel anomaly detection results corresponding to all pixel points in the first image.
4. The image abnormality detection method according to any one of claims 1 to 3, characterized in that, after the step of regarding the densely changing area as an area abnormality detection result between the first image and the second image, further comprising:
and labeling the dense change area in the first image, and outputting the labeled first image.
5. An image abnormality detection apparatus characterized by comprising:
the detection module is used for performing pixel-level anomaly detection on a first image and a second image to obtain pixel anomaly detection results corresponding to all pixel points in the first image, wherein the pixel anomaly detection results represent whether the corresponding pixel points in the first image change relative to the pixel points at the same position in the second image;
the searching module is used for searching a dense change area with the change density meeting a preset density condition in the first image, wherein the change density of the area is calculated according to the pixel abnormity detection result;
a determination module, configured to use the dense change area as an area anomaly detection result between the first image and the second image;
the lookup module is further to:
initializing a position parameter of a detection frame in the first image;
calculating a gradient value of an objective function relative to the position parameter according to the pixel abnormity detection result, wherein the objective function is a function constructed by taking the maximum change intensity of the framed area of the detection frame in the first image as a target;
updating the position parameter according to the gradient value, and detecting whether the updated position parameter meets a preset convergence condition;
if so, taking the area framed by the detection frame in the first image after updating the position parameters as a dense change area;
if not, returning to execute the operation of calculating the gradient value of the objective function relative to the position parameter according to the pixel abnormity detection result based on the updated position parameter;
the detection frame is a rectangular frame, and the search module is further configured to:
initializing position parameters of the detection frame in the first image, wherein the position parameters comprise a first size transformation parameter, a second size transformation parameter and an abscissa and an ordinate of a vertex of a lower left corner of the detection frame in the first image, and the first size transformation parameter and the second size transformation parameter are respectively used for transforming the width and the height of the first image to obtain the size of the detection frame;
the value range of the pixel abnormity detection result is a first numerical value and a second numerical value, the first numerical value is a positive number, the second numerical value is a negative number, the absolute value of the first numerical value is the same as the absolute value of the second numerical value, the first numerical value represents that the corresponding pixel point in the first image is unchanged relative to the pixel point at the same position in the second image, and the second numerical value represents that the corresponding pixel point in the first image is changed relative to the pixel point at the same position in the second image;
the lookup module is further to:
calculating the sum of the pixel abnormity detection results of all pixel points of which the ordinate in the first image is the ordinate of the vertex at the lower left corner of the detection frame, to obtain a first calculation result;
multiplying the height of the first image by the second size transformation parameter and adding the ordinate of the vertex at the lower left corner of the detection frame to obtain the ordinate of the vertex at the upper right corner of the detection frame, and calculating the sum of the pixel abnormity detection results of all pixels of which the ordinate in the second image is the ordinate of the vertex at the upper right corner of the detection frame, to obtain a second calculation result;
subtracting the second calculation result from the first calculation result to obtain a third calculation result, taking the third calculation result as a gradient value of the objective function relative to the abscissa of the vertex at the lower left corner of the detection frame, and taking a result obtained by multiplying the third calculation result by the width of the first image as a gradient value of the objective function relative to the first size transformation parameter;
calculating the sum of the pixel abnormity detection results of all pixel points of which the abscissa in the first image is the abscissa of the vertex at the lower left corner of the detection frame to obtain a fourth calculation result;
multiplying the width of the first image by the first size transformation parameter and adding the abscissa of the vertex at the lower left corner of the detection frame to obtain the abscissa of the vertex at the upper right corner of the detection frame, and calculating the sum of the pixel abnormality detection results of all pixel points of which the abscissa in the second image is the abscissa of the vertex at the upper right corner of the detection frame to obtain a fifth calculation result;
and subtracting the fifth calculation result from the fourth calculation result to obtain a sixth calculation result, taking the sixth calculation result as a gradient value of the objective function relative to the ordinate of the vertex at the lower left corner of the detection frame, and taking a result obtained by multiplying the sixth calculation result by the height of the first image as a gradient value of the objective function relative to the second size transformation parameter.
6. An image abnormality detection apparatus characterized by comprising: a memory, a processor and an image anomaly detection program stored on the memory and executable on the processor, the image anomaly detection program when executed by the processor implementing the steps of the image anomaly detection method as claimed in any one of claims 1 to 4.
7. A computer-readable storage medium, characterized in that an image abnormality detection program is stored thereon, which when executed by a processor implements the steps of the image abnormality detection method according to any one of claims 1 to 4.