CN116311079B - Civil security engineering monitoring method based on computer vision - Google Patents
- Publication number: CN116311079B (application CN202310530853.5A)
- Authority: CN (China)
- Prior art keywords: edge, local, path, target, region
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Links
- 238000012544 monitoring process Methods 0.000 title claims abstract description 75
- 238000000034 method Methods 0.000 title claims abstract description 42
- 238000001914 filtration Methods 0.000 claims abstract description 37
- 238000012545 processing Methods 0.000 claims abstract description 7
- 238000010606 normalization Methods 0.000 claims description 14
- 238000004458 analytical method Methods 0.000 claims description 6
- 230000011218 segmentation Effects 0.000 claims description 6
- 238000012937 correction Methods 0.000 claims description 4
- 230000002708 enhancing effect Effects 0.000 claims description 2
- 239000000779 smoke Substances 0.000 description 18
- 238000004364 calculation method Methods 0.000 description 7
- 238000013527 convolutional neural network Methods 0.000 description 6
- 230000000694 effects Effects 0.000 description 5
- 238000012549 training Methods 0.000 description 4
- 238000001514 detection method Methods 0.000 description 2
- 230000007613 environmental effect Effects 0.000 description 2
- 238000012935 Averaging Methods 0.000 description 1
- 238000009825 accumulation Methods 0.000 description 1
- 238000013528 artificial neural network Methods 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000012512 characterization method Methods 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 238000003708 edge detection Methods 0.000 description 1
- 238000005286 illumination Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
Classifications
- G06V 20/52 — Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V 10/24 — Aligning, centring, orientation detection or correction of the image
- G06V 10/267 — Segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds
- G06V 10/36 — Applying a local operator, i.e. means to operate on image points situated in the vicinity of a given point; non-linear local filtering operations, e.g. median filtering
- G06V 10/44 — Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
Abstract
The invention relates to the technical field of image data processing, and in particular to a civil security engineering monitoring method based on computer vision, comprising the following steps: quantifying and amplifying the weak detail features present in an infrared monitoring image to obtain a plurality of edge regions; obtaining local suspicious regions between the edge regions according to the distances and distribution directions of their pixel points; combining pixel points to obtain a plurality of paths; deriving the trend distribution feature of each path from its local linear features and local trend features; obtaining final adjustment parameters from the differences in trend distribution features; and improving guided filtering with the final adjustment parameters. The invention improves guided filtering so that the weak edges corresponding to smoke or flame in the infrared monitoring image are strengthened to a greater extent, and improves the accuracy of fire monitoring from infrared monitoring images.
Description
Technical Field
The invention relates to the technical field of image data processing, in particular to a civil security engineering monitoring method based on computer vision.
Background
In a night monitoring environment, suspicious conditions are monitored with infrared monitoring equipment because of environmental factors such as illumination changes. In security engineering monitoring, fires usually need to be monitored and prevented, so smoke in the environment always requires strong attention. At night, however, the image features of the smoke, in particular its edge features, are weak, so an enhancement operation by a guided filtering algorithm is often required.
However, in infrared monitoring images taken at night, edge detection such as the Sobel operator often damages or loses the edge features of some regions when the gradient is obtained, and the conventional guided filtering algorithm cannot deal with this situation. The guided-filtering local feature parameters of the pixel points in the infrared monitoring image are therefore adjusted to address this problem.
Disclosure of Invention
The invention provides a civil security engineering monitoring method based on computer vision, which aims to solve the existing problems.
The civil security engineering monitoring method based on computer vision adopts the following technical scheme:
the invention provides a civil security engineering monitoring method based on computer vision, which comprises the following steps:
acquiring an infrared monitoring image, acquiring a gradient amplitude and a gradient direction, and marking a connected domain formed by pixel points larger than a gradient threshold as an edge region;
the slope directions corresponding to the two pixel points with the largest Euclidean distance in the edge area are marked as area distribution directions, and the slope directions corresponding to the two pixel points with the smallest Euclidean distance in any two edge areas are marked as relative distribution directions; obtaining a confidence coefficient factor according to the difference between the average value of the distribution directions of the areas and the relative distribution direction, and carrying out product correction on the minimum Euclidean distance by the confidence coefficient factor to obtain edge confidence coefficient;
marking any edge area as a target edge area, acquiring an edge area corresponding to the maximum edge confidence of the target edge area, marking the edge area as a second target edge area, acquiring the midpoint of two pixel points of the minimum Euclidean distance between the target edge area and the second target edge area, acquiring a local suspicious area according to the midpoint and the minimum Euclidean distance, and acquiring a plurality of paths from the starting point to the end point by taking the two pixel points as the starting point and the end point;
marking any path as a target path; for each pixel point in the target path, acquiring the direction of the line between its preceding and following pixel points on the path, marked as the local distribution direction, and obtaining the local linear feature according to the difference between the local distribution direction and the gradient direction of the target pixel point; marking the difference of the gradient directions of pixel points in the target path as the gradient direction difference, and obtaining the local trend feature from the accumulated sum of the gradient direction differences;
obtaining vertical distances and average vertical distances of all pixel points in the target path, forming a straight line with the starting point and the end point, recording the difference value of the vertical distances and the average vertical distances as a basic amplitude value, and obtaining trend distribution characteristics according to the average value of the product correction result of the local linear characteristics and the local trend characteristics on the basic amplitude value; obtaining local detail parameters according to the minimum difference value of the trend distribution characteristics, and recording the product adjustment result of the local detail parameters and the edge confidence as a final adjustment parameter;
and adjusting regularization parameters according to the final adjustment parameters to obtain an improved guided filtering algorithm, and filtering and enhancing the infrared monitoring image.
Further, the edge region is obtained by the following steps:
Gradient information in the infrared monitoring image is acquired using the Sobel operator, i.e., the gradient magnitude and the gradient direction. A segmentation threshold for the gradient magnitude in the infrared monitoring image is obtained using the OTSU algorithm and recorded as the gradient threshold. Connected-domain analysis is performed on all pixel points whose gradient magnitude is greater than the gradient threshold, and the resulting connected domains are recorded as edge regions.
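The edge-region extraction just described — Sobel gradients, an OTSU gradient threshold, and connected-domain analysis — can be sketched in plain NumPy (a minimal illustration; the function names and pure-Python labelling are our own, and a production system would typically use an image-processing library):

```python
import numpy as np

def sobel_gradients(img):
    """Gradient magnitude and direction (degrees) via 3x3 Sobel kernels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    p = np.pad(img.astype(float), 1, mode="edge")
    gx = np.zeros(img.shape, float)
    gy = np.zeros(img.shape, float)
    for dy in range(3):
        for dx in range(3):
            win = p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
            gx += kx[dy, dx] * win
            gy += ky[dy, dx] * win
    return np.hypot(gx, gy), np.degrees(np.arctan2(gy, gx))

def otsu_threshold(values, bins=256):
    """Otsu's method: pick the threshold maximising between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                 # class-0 weight
    mu = np.cumsum(p * centers)       # class-0 cumulative mean mass
    mu_t = mu[-1]                     # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1 - w0))
    return centers[np.nanargmax(sigma_b)]

def connected_regions(mask):
    """8-connected component labelling by iterative DFS; list of pixel lists."""
    h, w = mask.shape
    seen = np.zeros_like(mask, dtype=bool)
    regions = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                stack, comp = [(y, x)], []
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    comp.append((cy, cx))
                    for ny in range(cy - 1, cy + 2):
                        for nx in range(cx - 1, cx + 2):
                            if 0 <= ny < h and 0 <= nx < w \
                                    and mask[ny, nx] and not seen[ny, nx]:
                                seen[ny, nx] = True
                                stack.append((ny, nx))
                regions.append(comp)
    return regions
```

Pixels whose gradient magnitude exceeds `otsu_threshold(mag)` form the mask whose connected components are the edge regions.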
Further, the edge confidence coefficient is obtained by the following steps:
recording any edge area in the infrared monitoring image as a target edge area, calculating Euclidean distance of edge pixel points between every two edge pixel points in the target edge area, taking two pixel points with the farthest distance as two end points of the edge area, obtaining the slope of a connecting line of the two end points according to the coordinate positions of the two end points in the infrared monitoring image, and further obtaining a direction corresponding to the slope, wherein the direction is the area distribution direction of the target edge area;
acquiring the average value of the coordinate positions of two pixel points with the farthest Euclidean distance of any edge area in an image, namely the central position of the edge area, and then connecting the central positions of the target edge area and the rest edge areas, wherein the obtained connecting line direction is used as the relative distribution direction of the target edge area and the rest edge areas;
obtaining the shortest Euclidean distance between the pixel points in the target edge area and other edge areas;
the method for acquiring the edge confidence comprises the following steps:
$R_{i,j}=\exp\!\left(-\left|\theta_i-\theta_j\right|\cdot\left|\frac{\theta_i+\theta_j}{2}-\varphi_{i,j}\right|\right)\cdot\exp\!\left(-d_{i,j}\right)$

where $R_{i,j}$ denotes the edge confidence of the $i$-th edge region and the $j$-th edge region, $d_{i,j}$ denotes the (linearly normalised) shortest Euclidean distance between pixel points of the $i$-th and $j$-th edge regions, $\theta_i$ and $\theta_j$ denote the region distribution directions of the $i$-th and $j$-th edge regions, and $\varphi_{i,j}$ denotes the relative distribution direction of the two edge regions.
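The edge-confidence computation can be sketched as follows (a hedged reading: the patent's original formula image is not reproduced in the text, so the exponential form and the 90° angle scale used here are our own illustrative assumptions, consistent with the stated behaviour that closer, direction-consistent regions score higher):

```python
import numpy as np

def region_direction(pixels):
    """Direction in degrees, in [0, 180), of the line joining the two
    farthest pixels of an edge region (its region distribution direction)."""
    pts = np.asarray(pixels, float)
    d2 = ((pts[:, None] - pts[None]) ** 2).sum(-1)
    a, b = np.unravel_index(np.argmax(d2), d2.shape)
    dy, dx = pts[b] - pts[a]
    return np.degrees(np.arctan2(dy, dx)) % 180.0

def edge_confidence(theta_i, theta_j, phi_ij, d_norm):
    """Confidence that two edge regions belong to the same actual edge.
    Directions in degrees; d_norm is the linearly normalised shortest
    distance in [0, 1]. The 90**2 scale is an assumed normalisation."""
    factor = np.exp(-abs(theta_i - theta_j)
                    * abs((theta_i + theta_j) / 2 - phi_ij) / 90.0 ** 2)
    return factor * np.exp(-d_norm)
```

As intended, two nearby regions with matching distribution directions that also align with the direction of the line joining their centres receive a confidence near 1.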
Further, the path acquisition method is as follows:
obtaining the edge confidence between the target edge region and the other edge regions, and the maximum edge confidence among them; taking the shortest Euclidean distance between the end points of the target edge region and the edge region corresponding to the maximum edge confidence as the diameter, and the midpoint between the end points as the centre, a circular local region is set, and the circular local region between the two end points is recorded as the local suspicious region;
in the local suspicious region, the two closest pixel points of the two edge regions are taken as the two end points; one is arbitrarily chosen as the starting point and the other as the end point; a pixel point belonging to the local suspicious region is randomly selected within the eight-neighbourhood of the starting point as the second point on the path; taking the second point as the new current point, the operation used to acquire the second point is repeated, and so on, until the end point appears in the neighbourhood of the current point, at which moment the first path is obtained; the operation is repeated until all possible paths have been obtained, yielding a plurality of paths.
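The path acquisition above can be sketched as a depth-first enumeration of 8-connected simple paths (a minimal sketch; the patent describes a randomised walk repeated until all paths are found, but exhaustive DFS yields the same set of "all possible paths"):

```python
from itertools import product

def enumerate_paths(region, start, end, limit=10000):
    """All simple 8-connected paths from start to end whose interior
    points lie inside `region` (a set of (y, x) pixels); no pixel is
    visited twice on one path. `limit` caps the number of paths."""
    paths, stack = [], [(start, [start])]
    while stack and len(paths) < limit:
        (y, x), path = stack.pop()
        for ny, nx in product(range(y - 1, y + 2), range(x - 1, x + 2)):
            nxt = (ny, nx)
            if nxt == (y, x) or nxt in path:
                continue          # skip self and already-visited pixels
            if nxt == end:
                paths.append(path + [end])
            elif nxt in region:
                stack.append((nxt, path + [nxt]))
    return paths
```

For example, with interior pixels {(0, 1), (1, 1)} between end points (0, 0) and (0, 2), four distinct simple paths exist.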
Further, the local linear characteristic is obtained by the following steps:
the local linear characteristic acquisition method comprises the following steps:
$X_m=\left|\sin\left(\alpha_m-\beta_m\right)\right|$

where $X_m$ denotes the local linear feature of the $m$-th pixel point on the target path, $\alpha_m$ denotes the local distribution direction of the $m$-th pixel point on the target path, and $\beta_m$ denotes the gradient direction of the $m$-th pixel point on the target path.
Further, the local trend feature is obtained by the following steps:
the local trend feature acquisition method comprises the following steps:
$Y_m=1-\mathrm{norm}\!\left(\sum_{c=1}^{C_m}\Delta g_{c,m}\right)$

where $Y_m$ denotes the local trend feature of the $m$-th pixel point, $\mathrm{norm}(\cdot)$ denotes a linear normalisation function, $C_m$ denotes the number of pixel points on the target path whose local trend is close to that of the $m$-th pixel point, and $\Delta g_{c,m}$ denotes the gradient direction difference between the $c$-th pixel point and the $m$-th pixel point.
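The two per-pixel features can be sketched as follows (hedged: the $|\sin|$ form — which peaks when the gradient is perpendicular to the local run, as a true edge behaves — and the mapping of the accumulated gradient-direction difference into (0, 1] are our own assumptions consistent with the description, not the patent's exact formulas):

```python
import numpy as np

def local_linear_feature(local_dir, grad_dir):
    """|sin| of the angle between the path's local distribution direction
    and the gradient direction: 1 when they are perpendicular (edge-like),
    0 when they are parallel. Angles in degrees."""
    return abs(np.sin(np.radians(local_dir - grad_dir)))

def local_trend_feature(grad_dirs, m, tol=10.0):
    """Local trend of pixel m: accumulate gradient-direction differences
    to the pixels whose direction lies within `tol` degrees of pixel m,
    then map a small accumulated difference to a value near 1."""
    diffs = np.abs(np.asarray(grad_dirs, float) - float(grad_dirs[m]))
    close = diffs[diffs <= tol]
    return 1.0 - close.sum() / (len(close) * tol + 1e-9)
```

A pixel on a consistently oriented stretch of path thus receives feature values near 1 for both quantities.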
Further, the trend distribution characteristics are obtained by the following steps:
and marking any path as a target path, wherein the trend distribution characteristic acquisition method of the target path comprises the following steps:
$Q_n=\sqrt{\frac{1}{M}\sum_{m=1}^{M}\left(X_m\,Y_m\left(L_m-\bar{L}\right)\right)^2}$

where $M$ denotes the number of pixel points contained in the target path, $L_m$ denotes the vertical distance between the $m$-th pixel point of the target path and the straight line joining the start and end points, $\bar{L}$ denotes the average vertical distance of all pixel points of the target path from that line, $X_m$ denotes the local linear feature of the $m$-th pixel point on the target path, $Y_m$ denotes the local trend feature of the $m$-th pixel point, and $L_m-\bar{L}$ denotes the basic amplitude.
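The trend distribution feature — a standard deviation of distances from the start-end chord, with each deviation scaled by the pixel's local linear and local trend features — can be sketched as (the vector geometry and names here are our own):

```python
import numpy as np

def trend_distribution_feature(points, start, end, lin_w, trend_w):
    """Weighted standard deviation of the signed perpendicular distances of
    path pixels from the start-end chord; each deviation from the mean
    (the basic amplitude) is scaled by its per-pixel feature weights."""
    p0, p1 = np.asarray(start, float), np.asarray(end, float)
    chord = p1 - p0
    n = np.array([-chord[1], chord[0]]) / np.linalg.norm(chord)  # unit normal
    d = (np.asarray(points, float) - p0) @ n   # signed vertical distances
    base = d - d.mean()                        # basic amplitude
    w = np.asarray(lin_w, float) * np.asarray(trend_w, float)
    return np.sqrt(np.mean((w * base) ** 2))
```

A path running straight along the chord yields 0; a path oscillating about the chord yields a positive value driven by the weighted amplitudes.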
Further, the local detail parameters are obtained by the following steps:
The absolute values of the differences of the trend distribution feature between the $n$-th path and the corresponding edge path 1 and edge path 2 are obtained, and the minimum of these absolute values is recorded as the first feature $\beta_n$ of the $n$-th path. The first features of all paths are linearly normalised, and the linearly normalised first feature of the $n$-th path is denoted $\mathrm{norm}(\beta_n)$;
The local detail parameter $\gamma_n$ of the $n$-th path is obtained as

$\gamma_n=1-\mathrm{norm}\!\left(\min\left(\left|Q_n-Q_{e1}\right|,\left|Q_n-Q_{e2}\right|\right)\right)$

where $\left|Q_n-Q_{e1}\right|$ and $\left|Q_n-Q_{e2}\right|$ denote the absolute differences of the trend distribution feature between the $n$-th path and the corresponding edge path 1 and edge path 2, $\min(\cdot)$ takes the minimum value, and $\mathrm{norm}(\cdot)$ denotes a linear normalisation function.
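The local detail parameter can be sketched as follows (assuming, as one consistent reading of the description, that a smaller trend-feature difference to the edge paths yields a larger parameter; names and the normalisation handling are our own):

```python
import numpy as np

def local_detail_params(q_paths, q_edge1, q_edge2):
    """First feature = min absolute trend-feature difference to the two
    edge paths; linearly normalise over all paths; local detail
    parameter = 1 - normalised first feature."""
    q = np.asarray(q_paths, float)
    beta = np.minimum(np.abs(q - q_edge1), np.abs(q - q_edge2))
    rng = beta.max() - beta.min()
    norm = (beta - beta.min()) / rng if rng > 0 else np.zeros_like(beta)
    return 1.0 - norm
```

Multiplying each path's parameter by the edge confidence of its region pair then gives the final adjustment parameter of the claims.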
Further, the regularization parameters are adjusted according to the final adjustment parameters to obtain an improved guided filtering algorithm, which comprises the following specific steps:
the improved representation method of the local features in the guided filtering algorithm comprises the following steps:
$a_z=\frac{\sigma_p^2}{\sigma_I^2+\dfrac{\epsilon}{1+P_z}}$

where $a_z$ denotes the improved local feature of the $z$-th pixel point in the local suspicious region between the $i$-th edge region and the $j$-th edge region, $\sigma_p^2$ denotes the variance of the gray values in the infrared monitoring image, $\sigma_I^2$ denotes the variance of the guide image (identical to $\sigma_p^2$ in self-guided filtering), $\epsilon$ denotes the regularisation coefficient in the guided filtering algorithm, and $P_z$ denotes the final adjustment parameter of the $z$-th pixel point in the local suspicious region between the $i$-th and $j$-th edge regions.
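The adjusted guided-filter coefficient can be sketched as follows (hedged: the $\epsilon/(1+P_z)$ shrinkage is an assumed form consistent with the stated goal — a larger final adjustment parameter should weaken the smoothing and so preserve weak edges more strongly — since the patent's formula image is not reproduced in the text):

```python
import numpy as np

def improved_local_coeff(sigma2, eps, p):
    """Per-pixel guided-filter linear coefficient a, with the regularisation
    eps shrunk by the final adjustment parameter p. In the self-guided case
    the guide variance equals the image variance sigma2; a closer to 1
    means weaker smoothing and stronger edge preservation."""
    return sigma2 / (sigma2 + eps / (1.0 + np.asarray(p, float)))
```

With `p = 0` this reduces to the conventional coefficient `sigma2 / (sigma2 + eps)`; increasing `p` pushes the coefficient toward 1 for the pixels of the local suspicious region.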
The technical scheme of the invention has the beneficial effects that:
when the conventional guide filtering algorithm carries out filtering enhancement on an infrared monitoring image in a night environment, the original image is used as a template to enhance the edge information in the image, but when part of the edge information in the image is lost due to the small gradient characteristics, part of details are lost, and the conventional guide filtering algorithm has poor effect on the condition, even the weak gradient characteristics are smoothed, so that the information loss degree is larger, and the monitoring and recognition of smoke are influenced. According to the method, weak detail features existing in the infrared monitoring image are quantized and amplified to obtain a plurality of edge areas, an optimal matching target is obtained according to the similarity, a local suspicious area among the edge areas is obtained, a plurality of paths formed by combining all pixel points in the edge areas are obtained, the degree of the pixel points on each path conforming to the linear feature is used as weight, in calculation of trend distribution features of the whole path, standard deviation is corrected according to the local features of the pixel points, trend distribution feature differences of each path and the corresponding edge area are obtained, the trend distribution feature differences are used as adjustment parameters of each pixel point, the enhancement degree of each pixel point in guide filtering is adjusted based on the adjustment parameters, the weaker part of the edge features is enabled to obtain higher enhancement degree in the guide filtering, meanwhile, the defect that smoke is smoothened by mistake due to weaker gradient is avoided, the enhancement effect of the smoke in the infrared monitoring is better, and the accuracy of smoke monitoring is further improved.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of steps of a civil security engineering monitoring method based on computer vision;
FIG. 2 is a schematic view of an edge area;
FIG. 3 is a schematic diagram of the distribution direction.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve the preset aim, the following detailed description is given below of the computer vision-based civil security engineering monitoring method according to the invention, which is provided by combining the accompanying drawings and the preferred embodiment. In the following description, different "one embodiment" or "another embodiment" means that the embodiments are not necessarily the same. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of the civil security engineering monitoring method based on computer vision provided by the invention with reference to the accompanying drawings.
Referring to fig. 1, a flowchart of a computer vision-based civil security engineering monitoring method according to an embodiment of the invention is shown, and the method includes the following steps:
and S001, acquiring a monitoring image of the night civil security engineering by using an infrared camera.
A monitoring image of the civil security engineering site is acquired with an infrared camera at night. During dark periods, environmental changes at the site are important to the security engineering, and places where a fire easily occurs must be monitored with emphasis. The acquired monitoring image is input into a trained VGGNet convolutional neural network, which outputs a detection result indicating whether smoke or flame appears in the monitoring image.
the training process of the VGGNet convolutional neural network is as follows:
(1) Acquiring a large number of monitoring images, artificially taking whether flames and smoke exist as artificial labels of the monitoring images, taking the monitoring images with the artificial labels as samples, and taking a set formed by a large number of samples as a data set of the VGGNet convolutional neural network;
(2) Training the VGGNet convolutional neural network by utilizing the data set, outputting a judging result of whether flame or smoke exists in the monitoring image, and finishing the training of the VGGNet convolutional neural network;
(3) The trained VGGNet convolutional neural network is used to monitor the civil security engineering site and judge in real time whether flame or smoke is present. A monitoring image in which flame or smoke is suspected is acquired and recorded as an infrared monitoring image, on which the subsequent analysis and processing are carried out.
Step S002, obtaining gradient characteristics by utilizing a Sobel operator, detecting a connected domain to obtain an edge region, and obtaining edge confidence according to the distance between pixel points of the edge region.
Conventional guided filtering algorithms perform a local linear enhancement of edge features based on the edge gradient features present in the image itself. However, in images collected by infrared monitoring equipment, part of the smoke is thin and faint and its image features are less obvious — for example, colour and edge information is blurred — so obvious edge features are lost and the smoke is enhanced poorly during filtering.
Therefore, in this embodiment, a plurality of edge regions in the image is obtained from the gradient features, and the most probable matching result of each edge region is obtained from local similarity features, yielding the local suspicious region in which missing information may occur. A plurality of paths is then determined in the local suspicious region; for each path and its corresponding edge region, the final trend distribution feature is obtained from the standard deviation of the vertical-distance distribution between the pixel points and the distribution direction, together with the local linear feature and local trend feature of each pixel point, and the corresponding filter adjustment parameters are obtained from the differences.
After gradient information in the infrared monitoring image is obtained with the Sobel operator and connected domains are determined under the set conditions, most of what is obtained are intermittent segments of the smoke edge features, as shown in FIG. 2. The missing information usually lies between these intermittent segments, but treating the area between every pair of edge regions as a candidate would be unreasonable; the region where missing information may exist must be determined, so that the best counterpart of each edge region is found. As shown in FIG. 2, the missing edge information adjacent to edge region 2 is relatively more likely to lie in the area between it and edge region 1 than near edge region 3, and the judgment is made on this basis.
Firstly, acquiring gradient information in an infrared monitoring image by utilizing a Sobel operator, namely acquiring the magnitude and the direction of a corresponding gradient amplitude, acquiring a segmentation threshold value of the gradient amplitude in the infrared monitoring image by utilizing an OTSU algorithm, marking the segmentation threshold value as a gradient threshold value, and carrying out connected domain analysis on all pixel points with the gradient amplitude larger than the gradient threshold value to acquire a plurality of connected domains, and marking the connected domains as edge regions.
For the i-th edge region, after its edge pixel points are obtained, the Euclidean distance between every two edge pixel points in the region is calculated, and the two farthest pixel points are taken as the two end points of the edge region. From the coordinate positions of the two end points in the image, the slope of the line joining them is obtained, and hence the corresponding direction, which is the region distribution direction of the i-th edge region, denoted $\theta_i$.
At the same time, the mean of the coordinate positions of the two end points of the i-th edge region in the image is obtained, i.e., the central position of the i-th edge region. The central positions of the i-th edge region and the j-th edge region are then connected; the direction of the resulting line is the relative distribution direction of the i-th and j-th edge regions, denoted $\varphi_{i,j}$.
It should be noted that, the direction of the connecting line in this embodiment refers to an angle between the straight line of the connecting line and the clockwise direction of the horizontal straight line.
In addition, any two pixel points taken respectively from the i-th edge region and the j-th edge region form a pixel point pair, and the Euclidean distance of the pair is calculated. The Euclidean distances of all pixel point pairs between the i-th and j-th edge regions are linearly normalised, and the minimum value is selected and recorded as $d_{i,j}$; the two pixel points corresponding to this minimum Euclidean distance serve as the end points between the i-th and j-th edge regions;
Thus, for the i-th edge region and the j-th edge region, the edge confidence that they belong to the same actual edge is obtained from their position distribution information:
$R_{i,j}=\exp\!\left(-\left|\theta_i-\theta_j\right|\cdot\left|\frac{\theta_i+\theta_j}{2}-\varphi_{i,j}\right|\right)\cdot\exp\!\left(-d_{i,j}\right)$

where $R_{i,j}$ denotes the edge confidence of the $i$-th edge region and the $j$-th edge region, $d_{i,j}$ denotes the (linearly normalised) shortest Euclidean distance between pixel points of the $i$-th and $j$-th edge regions, $\theta_i$ and $\theta_j$ denote the region distribution directions of the $i$-th and $j$-th edge regions, and $\varphi_{i,j}$ denotes the relative distribution direction of the two edge regions;
The smaller the minimum Euclidean distance $d_{i,j}$, the closer the i-th and j-th edge regions, and the higher the likelihood that edge features are missing in the local region between the two regions rather than near the remaining edge regions, i.e., the greater the edge confidence that the two edge regions belong to the same actual edge;
The direction term is recorded as the confidence factor. It expresses that when the distribution directions of the two edge regions are similar to each other and also similar to the direction of the line joining their central positions, the two regions better fit the adjacent edge features at the two ends of a broken section of one actual edge. Since the two feature values follow the same confidence logic, are both important, and should influence each other, they are combined multiplicatively to prevent a locally optimal solution caused by one value being too large; the confidence obtained from the direction terms is then further adjusted by the distance term;
Thus, for the i-th edge region, whether two edge regions belong to the actual edge of the same object is determined from the edge confidence between the i-th edge region and each other edge region. The edge region for which the edge confidence $R_{i,j}$ is maximal is the optimal matching result of the i-th edge region, i.e., the region most likely to belong to the same actual edge.
After the edge confidence between the i-th edge region and each other edge region is obtained, the maximum edge confidence is selected. The distance between the end points of the i-th edge region and the matched edge region is taken as the diameter, and a circular local region is set about the midpoint of the end points; since the circular region between the two end points may contain missing edges, it is recorded as the local suspicious region.
So far, the regional distribution direction of each edge region is obtained according to the furthest distance of the pixel points in each edge region, the relative distribution direction between the edge regions is obtained according to the closest distance of the pixel points between the edge regions, and the local suspicious region is obtained according to the regional distribution direction and the edge confidence obtained by the relative distribution direction.
Step S003, for each edge region, obtaining a local suspicious region according to the edge confidence, obtaining a path in the local suspicious region, and analyzing the path in the local suspicious region to obtain final adjustment parameters of the pixel point.
In the local suspicious region, the two closest pixel points of the two edge regions are taken as the two end points; one is arbitrarily chosen as the starting point and the other as the end point. A pixel point belonging to the local suspicious region is randomly selected within the eight-neighbourhood of the starting point as the second point on the path; taking the second point as the new current point, the operation used to acquire the second point is repeated, and so on, until the end point appears in the neighbourhood of the current point, at which moment the first path is obtained. The operation is repeated until all possible paths have been obtained, yielding a plurality of paths.
When each path is determined, a pixel point cannot be visited twice; in the worst case, a path traverses the whole local suspicious region exactly once.
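The path enumeration described above amounts to an exhaustive depth-first search over the eight-neighbourhood with no revisited pixels. A minimal sketch follows; all names are illustrative, and the search is exponential in the worst case, which is tolerable only because local suspicious regions are small.

```python
def all_paths(region, start, end):
    """Enumerate every non-self-intersecting 8-connected path from `start`
    to `end` whose interior pixels lie in `region` (a set of (row, col)
    tuples). Exhaustive depth-first search, as in the text: from the
    current point, step to any unvisited neighbour until the end point
    appears in the neighbourhood."""
    steps = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]
    paths = []

    def walk(path, visited):
        r, c = path[-1]
        for dr, dc in steps:
            nxt = (r + dr, c + dc)
            if nxt == end:                       # end point reached
                paths.append(path + [nxt])
            elif nxt in region and nxt not in visited:
                walk(path + [nxt], visited | {nxt})

    walk([start], {start})
    return paths

# Two interior pixels between the end points (0, 0) and (0, 2).
paths = all_paths({(0, 1), (1, 1)}, (0, 0), (0, 2))
```

On this tiny region the four possible paths are found, and no path repeats a pixel.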
For the plurality of paths obtained in the local suspicious region, each path needs to be judged: the difference in fluctuation degree between the path and the corresponding edge regions, i.e. the standard deviation of the corresponding distance distribution, is used to select among the paths and to optimize the guided filtering. However, the conventional standard deviation cannot represent the specific travelling-direction distribution of a path: two paths whose travelling directions are obviously different may yield very similar standard deviations. The standard deviation is therefore corrected according to the local features of each pixel point on the path, so that a more accurate standard deviation representing the trend distribution characteristics of the path is obtained for optimal path selection.
When a pixel point belongs to an actual edge, its gradient direction is perpendicular to the direction of the edge.
For the nth path, the pixel points immediately before and after the mth pixel point on the path are obtained and connected; the angle of this connecting line is recorded as the local distribution direction of the mth pixel point on the nth path. Taking fig. 3 as an example, suppose pixel point 1 is the mth pixel point: if pixel points 4 and 2 are the front and rear pixel points, the connection angle is 135°, and if pixel points 4 and 8 are the front and rear pixel points, the connection angle is 90°.
Taking any one of all paths and recording it as the target path: for each pixel point on the target path, a local linear feature is obtained from the difference between the pixel point's local distribution direction and its gradient direction,
wherein F_m represents the local linear feature of the mth pixel point on the target path, θ_m represents the local distribution direction of the mth pixel point on the target path, and g_m represents the gradient direction of the mth pixel point on the target path.
The local distribution direction θ_m of the mth pixel point is obtained from the pixel points before and after it on the path, and the difference between θ_m and the gradient direction g_m measures the degree to which the mth pixel point conforms to the linear edge characteristic; this degree is recorded as the local linear feature of the mth pixel point on the target path. The more nearly perpendicular the linear direction of the local suspicious region at the mth pixel point is to its gradient direction, the more likely that pixel point lies on lost edge information.
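A plausible closed form for the local linear feature, assumed here because the exact expression is not legible in this text, is |sin(θ_m − g_m)|: it equals 1 when the local distribution direction is perpendicular to the gradient direction, matching the stated behaviour. The sketch also shows the local distribution direction computed from the neighbouring path pixels, using (row, column) coordinates.

```python
import math

def local_distribution_direction(prev_pt, next_pt):
    """Angle in degrees, folded to [0, 180), of the line joining the
    pixels before and after the current point on the path."""
    dy = next_pt[0] - prev_pt[0]
    dx = next_pt[1] - prev_pt[1]
    return math.degrees(math.atan2(dy, dx)) % 180.0

def local_linear_feature(theta_deg, grad_deg):
    """Assumed form |sin(theta - g)|: 1 when the local distribution
    direction is perpendicular to the gradient direction, 0 when the
    two directions coincide."""
    return abs(math.sin(math.radians(theta_deg - grad_deg)))

theta = local_distribution_direction((1, 0), (0, 1))  # diagonal neighbours
f_perp = local_linear_feature(theta, theta + 90.0)
```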
Step two: the local linear feature only represents, through the difference between the gradient direction and the local distribution direction of the mth pixel point, the credibility of that pixel point itself on the target path, and cannot represent the trend characteristic. A local trend feature is therefore obtained for each pixel point from the trend similarity within its local range on the path, and brought into the standard deviation to represent the trend distribution characteristic of the whole path.
For the mth pixel point on the target path, following the order of the pixel points along the path, the absolute value of the difference between the gradient direction of each subsequent pixel point and that of the mth pixel point is obtained and recorded as their gradient direction difference. Accumulation stops once this difference exceeds a set similarity threshold; the pixel points accumulated up to that moment are the pixel points on the target path whose local trend is similar to that of the mth pixel point, and their number is recorded as C_m. The gradient direction difference between the cth of these C_m pixel points and the mth pixel point is recorded as Δ_mc, and in this embodiment the gradient direction differences Δ_mc of all C_m pixel points are subjected to linear normalization.
The local trend feature of the mth pixel point is then calculated,
wherein T_m represents the local trend feature of the mth pixel point, Norm(·) represents the linear normalization function, C_m represents the number of pixel points on the target path whose local trend is similar to that of the mth pixel point, and Δ_mc represents the gradient direction difference between the cth such pixel point and the mth pixel point;
the more all the pixel points on the target path, which are similar to the local trend of the mth pixel point, the larger the local range is, and meanwhile, the gradient direction of the pixel points in the range is very similar, the more the mth pixel point is shown to be similar to the trend in the local suspicious region, the higher the represented information degree is, and the higher the corresponding information amount in the trend distribution characteristics obtained by calculating the standard deviation is.
Step three: the straight line connecting the starting point and the end point of the target path is obtained; the perpendicular distance between the mth pixel point of the target path and this line is obtained and recorded as d_m, and the average of the perpendicular distances of all pixel points of the target path is recorded as the average perpendicular distance d̄.
The trend distribution characteristic of the target path is then obtained from the local trend features and local linear features of its pixel points,
wherein d_m represents the perpendicular distance between the mth pixel point of the target path and the connecting line, d̄ represents the average perpendicular distance between all pixel points of the target path and the connecting line, F_m represents the local linear feature of the mth pixel point on the target path, T_m represents the local trend feature of the mth pixel point, M represents the number of pixel points contained in the target path, and (d_m − d̄) represents the base amplitude value;
here (d_m − d̄) is the difference between the perpendicular distance of the mth pixel point and the average perpendicular distance of all pixel points to the connecting line. The larger the difference between the gradient direction g_m of the mth pixel point and its local distribution direction θ_m, the less the pixel point conforms to the characteristic of a linear edge region, so its weight in the distribution trend characteristic calculated through the standard deviation is smaller. Conversely, the more pixel points share a gradient direction similar to that of the mth pixel point, the longer the locally similar region, the stronger the local trend characteristic represented by that pixel point, and the higher its weight in the calculation. Because F_m and T_m are both weight adjustment values obtained by the mth pixel point from its own features within the standard deviation, (d_m − d̄) serves as the base amplitude value and the two weights are multiplied onto it directly. Finally, the distribution trend characteristic of the nth path is calculated through summation. Compared with the plain standard deviation, introducing these two weight values makes paths with different trend distributions yield clearly different values: the more similar and the longer the local features, the higher their weight in the distribution trend characteristic, so the improved standard deviation can distinguish regions that the conventional standard deviation cannot.
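One plausible reading of the corrected standard deviation, in which each squared base amplitude is weighted by the local linear feature and the local trend feature before the variance-style average; the exact combining form is an assumption.

```python
import math

def trend_distribution_feature(dists, F, T):
    """Corrected standard deviation of a path: each base amplitude
    (d_m minus the mean perpendicular distance) is weighted by the
    local linear feature F[m] and the local trend feature T[m] before
    the variance-style average."""
    M = len(dists)
    d_bar = sum(dists) / M
    acc = sum(f * t * (d - d_bar) ** 2 for d, f, t in zip(dists, F, T))
    return math.sqrt(acc / M)

s_plain = trend_distribution_feature([0.0, 2.0], [1.0, 1.0], [1.0, 1.0])
s_damped = trend_distribution_feature([0.0, 2.0], [0.0, 0.0], [1.0, 1.0])
```

With unit weights the value reduces to the ordinary standard deviation; pixels that do not conform to the linear-edge characteristic (F = 0) contribute nothing.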
The operation of acquiring the trend distribution characteristic of the target path is repeated to acquire the trend distribution characteristics of all paths. Likewise, the two edge regions forming the target path are treated as edge paths, and the trend distribution characteristics of the two edge paths corresponding to the target path are acquired by the same method and recorded as S_1 and S_2 respectively; the two corresponding edge paths are recorded as edge path 1 and edge path 2, and a local detail parameter is obtained from the difference in trend distribution characteristics between the nth path and its corresponding edge paths 1 and 2.
The absolute values of the differences in trend distribution characteristic between the nth path and the corresponding edge path 1 and edge path 2 are obtained respectively, and the minimum of these two absolute values is recorded as the first feature e_n of the nth path. The first features of all paths are subjected to linear normalization, the linearly normalized first feature of the nth path being denoted ê_n, and the local detail parameter of the nth path is obtained as D_n = 1 − ê_n,
wherein D_n represents the local detail parameter of the nth path, |S_n − S_1| and |S_n − S_2| represent the absolute differences in trend distribution characteristic between the nth path and the corresponding edge path 1 and edge path 2, min(·) represents taking the minimum, and Norm(·) represents the linear normalization function;
the consideration above is that the paths in the local suspicious region are close to the adjacent edge paths 1 and 2 and share a certain local similarity, but edge path 1 and edge path 2 are not necessarily similar to each other. If the difference were represented by the average, the obtained local detail parameter could be reduced by one dissimilar side; the minimum is therefore selected as the final value. When the trend distribution characteristic of a path is particularly similar to that of one edge region, the path better conforms to the continuous linear distribution characteristic of the same edge, and the obtained local detail parameter is larger.
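The minimum-difference selection and the direction of the monotonicity (more similar to one edge path gives a larger parameter) can be sketched as follows; the `1 - Norm(...)` form is inferred from the stated behaviour, not quoted.

```python
def local_detail_parameters(path_feats, s_edge1, s_edge2):
    """Local detail parameter of each path: 1 minus the linearly
    normalised minimum absolute difference between the path's trend
    distribution feature and those of the two edge paths."""
    firsts = [min(abs(s - s_edge1), abs(s - s_edge2)) for s in path_feats]
    lo, hi = min(firsts), max(firsts)
    span = (hi - lo) or 1.0     # all-equal case: everything maps to 1
    return [1.0 - (e - lo) / span for e in firsts]

# First path matches edge path 1 exactly; second is equally far from both.
D = local_detail_parameters([1.0, 5.0], 1.0, 9.0)
```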
Step four: the edge confidence B_ij, obtained in step S002 from the similar features between the ith edge region and the remaining jth edge region, represents the probability that the corresponding local suspicious region contains missing edge features, and the local detail parameters D are obtained from the paths between the ith edge region and the remaining jth edge region.
For the zth pixel point in the local suspicious region corresponding to the ith edge region and the remaining jth edge region, all local detail parameters of the several paths passing through the zth pixel point are acquired, and the largest of them is taken as the local detail parameter of the zth pixel point in that local suspicious region, recorded as D_z. Combining D_z with the edge confidence B_ij between the ith edge region and the remaining jth edge region gives the final adjustment parameter of the zth pixel point, R_z = B_ij × D_z,
wherein R_z represents the final adjustment parameter of the zth pixel point in the local suspicious region of the ith edge region and the remaining jth edge region, B_ij represents the edge confidence between the ith edge region and the remaining jth edge region, and D_z represents the local detail parameter of the zth pixel point in the corresponding local suspicious region.
The above analysis is carried out for all edge regions in the infrared monitoring image to obtain the final adjustment parameters of all pixel points in the infrared monitoring image.
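The per-pixel combination can be sketched directly from the description (the product of edge confidence and local detail parameter, as also stated in claim 1), taking for each pixel the largest local detail parameter among the paths through it; the container names are illustrative.

```python
def final_adjustment_parameters(edge_conf, detail_by_path, paths):
    """Final adjustment parameter per pixel of a local suspicious region:
    the largest local detail parameter among the paths through the pixel,
    multiplied by the edge confidence of the region pair."""
    best = {}
    for D_n, path in zip(detail_by_path, paths):
        for px in path:
            if D_n > best.get(px, float("-inf")):
                best[px] = D_n
    return {px: edge_conf * d for px, d in best.items()}

R = final_adjustment_parameters(
    0.5,                                   # edge confidence of the pair
    [0.2, 0.8],                            # local detail parameter per path
    [[(0, 0), (0, 1)], [(0, 1), (1, 1)]],  # pixel lists of the two paths
)
```

Pixel (0, 1) lies on both paths, so it keeps the larger detail parameter 0.8 before the confidence product.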
And obtaining final adjustment parameters of all the pixel points according to the edge confidence and the local detail parameters.
And step S004, adjusting regularization parameters of the guided filtering algorithm according to the final adjustment parameters to obtain an improved guided filtering algorithm, and carrying out filtering enhancement on the infrared monitoring image.
The conventional guided filtering algorithm takes the guide image as a template to obtain local features a and b for each pixel point in the infrared monitoring image. The local feature a is improved by means of the final adjustment parameter; the improved expression of the local feature in the guided filtering algorithm is a_z = σ² / (σ² + ϵ / R_z),
wherein a_z represents the improved local feature of the zth pixel point in the local suspicious region of the ith edge region and the remaining jth edge region, σ² represents the variance of the gray values in the infrared monitoring image, and ϵ represents the regularization coefficient in the guided filtering algorithm. In order to strengthen the edges in the infrared monitoring image, the guide image is the infrared monitoring image itself, so the variance of the guide image equals the variance of the gray values in the infrared monitoring image; R_z represents the final adjustment parameter of the zth pixel point in the local suspicious region of the ith edge region and the remaining jth edge region.
When the gradient feature of the region where the zth pixel point is located is weak in the original image, the local feature of the guided filtering algorithm is small and approaches 0; the corresponding features are then not enhanced but smoothed out. The local feature is therefore improved with the adjustment parameter R_z obtained in step four: the larger R_z is, the smaller the corresponding final regularization coefficient, the larger the local feature a_z of the improved filtering algorithm, and the stronger the final filter enhancement effect.
It should be noted that, when the above analysis is performed on the infrared monitoring image, some pixel points may not participate in the calculation; the final adjustment parameters of these pixel points are set to 1, i.e. they are processed by the conventional guided filtering algorithm.
Therefore, by adjusting the value of the local feature, the problem that nearly-lost edge features are smoothed out because their gradient features are too weak is avoided, while these parts are amplified and enhanced on the basis of their own gradient features.
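The improved local feature can be sketched as a one-liner. Dividing the regularization coefficient by the final adjustment parameter is the reading consistent with the text (a larger adjustment parameter gives a smaller effective regularization and a larger local feature); R_z = 1 recovers the conventional guided filter.

```python
def improved_local_feature(sigma2, eps, R_z):
    """Improved guided-filter local feature with the guide image equal to
    the input image, so both variances coincide. R_z = 1 recovers the
    conventional guided filter a = sigma^2 / (sigma^2 + eps)."""
    return sigma2 / (sigma2 + eps / R_z)

a_plain = improved_local_feature(0.01, 0.01, 1.0)  # conventional filter
a_boost = improved_local_feature(0.01, 0.01, 2.0)  # suspicious-region pixel
```

A suspicious-region pixel with R_z > 1 gets a larger a and is therefore preserved rather than smoothed.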
In summary, Sobel-operator gradient detection is performed on the image, and a plurality of edge regions are obtained according to the set connected-domain judging conditions. An optimal matching target is obtained according to the similar features among the edge regions, and a local suspicious region is obtained by taking the distance between the end points corresponding to the shortest distance among the edge regions as a diameter. Multi-path identification is then carried out in the local suspicious region: the weight of each pixel point in each path is adjusted according to the difference between its own gradient direction and its local distribution direction, and the trend distribution characteristic of each path is calculated with the improved weight values. The edge confidence is obtained from the edge regions on the two sides corresponding to the trend distribution characteristics, the adjustment parameter of each pixel point in the local region is finally obtained, and the parameters of the guided filtering algorithm are adjusted accordingly to obtain the improved guided filtering algorithm.
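The front end of the summarised pipeline (Sobel gradients, a gradient-magnitude threshold, and eight-connected labelling of the strong pixels into edge regions) can be sketched as below. To keep the sketch dependency-free, the OTSU segmentation threshold is replaced by a fixed fraction of the maximum magnitude; that substitution, and all names, are assumptions.

```python
import numpy as np

def sobel_edge_regions(img, ratio=0.5):
    """Sobel gradient magnitude, a magnitude threshold (fixed fraction of
    the maximum here, standing in for OTSU), and 8-connected labelling of
    the above-threshold pixels into edge regions."""
    img = np.asarray(img, dtype=float)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            win = pad[r:r + 3, c:c + 3]
            gx[r, c] = (win * kx).sum()
            gy[r, c] = (win * ky).sum()
    mag = np.hypot(gx, gy)
    strong = mag > ratio * mag.max()
    # 8-connected component labelling by flood fill.
    labels = np.zeros(img.shape, dtype=int)
    n = 0
    for r, c in zip(*np.nonzero(strong)):
        if labels[r, c]:
            continue
        n += 1
        stack = [(r, c)]
        labels[r, c] = n
        while stack:
            y, x = stack.pop()
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if (0 <= yy < img.shape[0] and 0 <= xx < img.shape[1]
                            and strong[yy, xx] and not labels[yy, xx]):
                        labels[yy, xx] = n
                        stack.append((yy, xx))
    return labels, n

img = np.zeros((6, 6))
img[:, 3:] = 1.0          # vertical step edge
labels, n = sobel_edge_regions(img)
```

The vertical step produces one strong two-column band, labelled as a single edge region.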
Each frame of infrared monitoring image is filtered and enhanced with the improved guided filtering algorithm, the guide image being identical to the infrared monitoring image, to obtain an enhanced image. The VGGNet network mentioned in step S001 then outputs a more accurate judgment of whether smoke and flame appear in the infrared monitoring image, and an early warning is issued when smoke or flame is identified.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.
Claims (7)
1. The civil security engineering monitoring method based on computer vision is characterized by comprising the following steps of:
acquiring an infrared monitoring image, acquiring a gradient amplitude and a gradient direction, and marking a connected domain formed by pixel points larger than a gradient threshold as an edge region;
the slope directions corresponding to the two pixel points with the largest Euclidean distance in the edge area are marked as area distribution directions, and the slope directions corresponding to the two pixel points with the smallest Euclidean distance in any two edge areas are marked as relative distribution directions; obtaining a confidence coefficient factor according to the difference between the average value of the distribution directions of the areas and the relative distribution direction, and carrying out product correction on the minimum Euclidean distance by the confidence coefficient factor to obtain edge confidence coefficient;
marking any edge area as a target edge area, acquiring an edge area corresponding to the maximum edge confidence of the target edge area, marking the edge area as a second target edge area, acquiring the midpoint of two pixel points of the minimum Euclidean distance between the target edge area and the second target edge area, acquiring a local suspicious area according to the midpoint and the minimum Euclidean distance, and acquiring a plurality of paths from the starting point to the end point by taking the two pixel points as the starting point and the end point;
marking any path as a target path, acquiring the position direction of all pixel points in the target path between the last pixel point and the next pixel point in the path, marking the position direction as a local distribution direction, and acquiring local linear characteristics according to the difference between the local distribution direction and the gradient direction of the pixel points; marking the difference value of the gradient directions of the pixel points in the target path as gradient direction difference, and obtaining local trend characteristics according to the accumulated and summed result of the gradient direction difference;
obtaining vertical distances and average vertical distances of all pixel points in the target path, forming a straight line with the starting point and the end point, recording the difference value of the vertical distances and the average vertical distances as a basic amplitude value, and obtaining trend distribution characteristics according to the average value of the product correction result of the local linear characteristics and the local trend characteristics on the basic amplitude value; obtaining local detail parameters according to the minimum difference value of the trend distribution characteristics, and recording the product adjustment result of the local detail parameters and the edge confidence as a final adjustment parameter;
adjusting regularization parameters according to final adjustment parameters to obtain an improved guide filtering algorithm, and filtering and enhancing the infrared monitoring image;
the edge confidence coefficient is obtained by the following steps:
recording any edge area in the infrared monitoring image as a target edge area, calculating Euclidean distance of edge pixel points between every two edge pixel points in the target edge area, taking two pixel points with the farthest distance as two end points of the edge area, obtaining the slope of a connecting line of the two end points according to the coordinate positions of the two end points in the infrared monitoring image, and further obtaining a direction corresponding to the slope, wherein the direction is the area distribution direction of the target edge area;
acquiring the average value of the coordinate positions of two pixel points with the farthest Euclidean distance of any edge area in an image, namely the central position of the edge area, and then connecting the central positions of the target edge area and the rest edge areas, wherein the obtained connecting line direction is used as the relative distribution direction of the target edge area and the rest edge areas;
obtaining the shortest Euclidean distance between the pixel points in the target edge area and other edge areas;
the method for acquiring the edge confidence comprises the following steps:
wherein B_ij represents the edge confidence between the ith edge region and the remaining jth edge region, L_ij represents the shortest Euclidean distance between the pixel points of the ith edge region and the remaining jth edge region, α_i represents the region distribution direction of the ith edge region, α_j represents the region distribution direction of the remaining jth edge region, and β_ij represents the relative distribution direction of the ith edge region and the remaining jth edge region;
the local trend feature is obtained by the following steps:
the local trend feature acquisition method comprises the following steps:
wherein T_m represents the local trend feature of the mth pixel point, Norm(·) represents the linear normalization function, C_m represents the number of pixel points on the target path whose local trend is similar to that of the mth pixel point, and Δ_mc represents the gradient direction difference between the cth such pixel point and the mth pixel point.
2. The civil security engineering monitoring method based on computer vision according to claim 1, wherein the edge area is obtained by the following steps:
and acquiring gradient information in the infrared monitoring image by utilizing a Sobel operator, namely acquiring the magnitude and the direction of the corresponding gradient amplitude, acquiring a segmentation threshold value of the gradient amplitude in the infrared monitoring image by utilizing an OTSU algorithm, marking the segmentation threshold value as a gradient threshold value, and carrying out connected domain analysis on all pixel points with the gradient amplitude larger than the gradient threshold value to acquire a plurality of connected domains, and marking the connected domains as edge areas.
3. The civil security engineering monitoring method based on computer vision according to claim 1, wherein the path is obtained by the following steps:
obtaining edge confidences between the target edge region and the other edge regions and taking the maximum edge confidence; taking the shortest Euclidean distance between the end points of the target edge region and the edge region corresponding to the maximum edge confidence as a diameter and the midpoint between those end points as the circle centre, setting a circular local region, and recording the circular local region between the two end points as a local suspicious region;
in the local suspicious region, taking the two nearest pixel points of the two edge regions as two end points, arbitrarily selecting one of them as a starting point and the other as an end point; randomly selecting, within the eight-neighbourhood of the starting point, one pixel point belonging to the local suspicious region as the second point on the path; repeating from the second point the operation used to acquire the second point from the starting point, and likewise after the third point is found, until the end point appears within the neighbourhood, at which moment the first path is acquired; and repeating the operation until all possible paths are obtained, obtaining a plurality of paths.
4. The civil security engineering monitoring method based on computer vision according to claim 1, wherein the local linear characteristics are obtained by the following steps:
the local linear characteristic acquisition method comprises the following steps:
wherein F_m represents the local linear feature of the mth pixel point on the target path, θ_m represents the local distribution direction of the mth pixel point on the target path, and g_m represents the gradient direction of the mth pixel point on the target path.
5. The civil security engineering monitoring method based on computer vision according to claim 1, wherein the trend distribution characteristics are obtained by the following steps:
and marking any path as a target path, wherein the trend distribution characteristic acquisition method of the target path comprises the following steps:
wherein M represents the number of pixel points contained in the target path, d_m represents the perpendicular distance between the mth pixel point of the target path and the connecting line, d̄ represents the average perpendicular distance between all pixel points of the target path and the connecting line, F_m represents the local linear feature of the mth pixel point on the target path, T_m represents the local trend feature of the mth pixel point, and (d_m − d̄) represents the base amplitude.
6. The civil security engineering monitoring method based on computer vision according to claim 1, wherein the local detail parameters are obtained by the following steps:
respectively obtaining the absolute values of the differences in trend distribution characteristic between the nth path and the corresponding edge path 1 and edge path 2, and recording the minimum of these absolute values as the first feature e_n of the nth path; subjecting the first features of all paths to linear normalization, the linearly normalized first feature of the nth path being denoted ê_n;
the local detail parameter D_n of the nth path is acquired as follows:
wherein |S_n − S_1| and |S_n − S_2| represent the absolute differences in trend distribution characteristic between the nth path and the corresponding edge path 1 and edge path 2, min(·) represents taking the minimum, and Norm(·) represents the linear normalization function.
7. The civil security engineering monitoring method based on computer vision according to claim 1, wherein the regularization parameter is adjusted according to the final adjustment parameter to obtain an improved guided filtering algorithm, and the method comprises the following specific steps:
the improved representation method of the local features in the guided filtering algorithm comprises the following steps:
wherein a_z = σ² / (σ² + ϵ / R_z) represents the improved local feature of the zth pixel point in the local suspicious region of the ith edge region and the remaining jth edge region, σ² represents the variance of the gray values in the infrared monitoring image and equals the variance of the guide image, the guide image being the infrared monitoring image itself, ϵ represents the regularization coefficient in the guided filtering algorithm, and R_z represents the final adjustment parameter of the zth pixel point in the local suspicious region of the ith edge region and the remaining jth edge region.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310530853.5A CN116311079B (en) | 2023-05-12 | 2023-05-12 | Civil security engineering monitoring method based on computer vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116311079A CN116311079A (en) | 2023-06-23 |
CN116311079B true CN116311079B (en) | 2023-09-01 |
Family
ID=86830842
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310530853.5A Active CN116311079B (en) | 2023-05-12 | 2023-05-12 | Civil security engineering monitoring method based on computer vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116311079B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116758085B (en) * | 2023-08-21 | 2023-11-03 | 山东昆仲信息科技有限公司 | Visual auxiliary detection method for infrared image of gas pollution |
CN116977327B (en) * | 2023-09-14 | 2023-12-15 | 山东拓新电气有限公司 | Smoke detection method and system for roller-driven belt conveyor |
CN116993632B (en) * | 2023-09-28 | 2023-12-19 | 威海广泰空港设备股份有限公司 | Production fire early warning method based on machine vision |
CN117058625B (en) * | 2023-10-11 | 2024-01-16 | 济宁港航梁山港有限公司 | Campus fire control remote monitoring system based on thing networking |
CN118096723B (en) * | 2024-04-17 | 2024-07-02 | 海门裕隆光电科技有限公司 | Visual inspection method for production quality of robot cable |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108038887A (en) * | 2017-12-11 | 2018-05-15 | 天津大学 | Based on binocular RGB-D camera depth profile methods of estimation |
CN113837198A (en) * | 2021-05-18 | 2021-12-24 | 中国计量大学 | Improved self-adaptive threshold Canny edge detection method based on three-dimensional block matching |
CN114972357A (en) * | 2022-08-03 | 2022-08-30 | 南通恒立机械设备有限公司 | Roller surface defect detection method and system based on image processing |
CN115691026A (en) * | 2022-12-29 | 2023-02-03 | 湖北省林业科学研究院 | Intelligent early warning monitoring management method for forest fire prevention |
CN115841434A (en) * | 2023-02-21 | 2023-03-24 | 深圳市特安电子有限公司 | Infrared image enhancement method for gas concentration analysis |
Non-Patent Citations (1)
Title |
---|
"基于整体结构相似的红外和可见光图像匹配方法研究";王琦;《中国博士学位论文全文数据库》;第33-61页 * |
Also Published As
Publication number | Publication date |
---|---|
CN116311079A (en) | 2023-06-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN116311079B (en) | Civil security engineering monitoring method based on computer vision | |
CN112819772B (en) | High-precision rapid pattern detection and recognition method | |
JP3123587B2 (en) | Moving object region extraction method using background subtraction | |
CN114757900B (en) | Artificial intelligence-based textile defect type identification method | |
CN115082683A (en) | Injection molding defect detection method based on image processing | |
CN110298297B (en) | Flame identification method and device | |
CN107167810B (en) | Method for rapidly extracting underwater target by side-scan sonar imaging | |
TWI726321B (en) | Method, device and system for determining whether pixel positions in an image frame belong to a background or a foreground | |
CN116205919A (en) | Hardware part production quality detection method and system based on artificial intelligence | |
CN111553310B (en) | Security inspection image acquisition method and system based on millimeter wave radar and security inspection equipment | |
CN117218122B (en) | Watch shell quality detection method based on image data | |
CN109242032B (en) | Target detection method based on deep learning | |
CN114266743B (en) | FPC defect detection method, system and storage medium based on HSV and CNN | |
CN111612773B (en) | Thermal infrared imager and real-time automatic blind pixel detection processing method | |
CN115082429A (en) | Aluminum bar defect detection method based on image processing | |
CN115862259A (en) | Fire alarm early warning system based on temperature monitoring | |
CN116188468A (en) | Intelligent control system for HDMI cable transmission sorting | |
CN117745715A (en) | Large-caliber telescope lens defect detection method based on artificial intelligence | |
CN117115210A (en) | Intelligent agricultural monitoring and adjusting method based on Internet of things | |
CN116630332B (en) | PVC plastic pipe orifice defect detection method based on image processing | |
CN116993614A (en) | Defogging method for fused image of fine sky segmentation and transmissivity | |
CN113920065B (en) | Imaging quality evaluation method for visual detection system of industrial site | |
CN112085683B (en) | Depth map credibility detection method in saliency detection | |
CN117474916B (en) | Image detection method, electronic equipment and storage medium | |
CN110930435B (en) | Multi-background integrated infrared sequence moving object detection method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||