CN116029941B - Visual image enhancement processing method for construction waste - Google Patents


Publication number
CN116029941B
CN116029941B (application CN202310300459.2A)
Authority
CN
China
Legal status: Active
Application number
CN202310300459.2A
Other languages
Chinese (zh)
Other versions
CN116029941A
Inventor
黎壬志
朱创
鲁力
肖鹏
董维军
杨升
李靖
杨翠
Current Assignee
Hunan Rongcheng Environmental Protection Technology Co ltd
Original Assignee
Hunan Rongcheng Environmental Protection Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hunan Rongcheng Environmental Protection Technology Co ltd filed Critical Hunan Rongcheng Environmental Protection Technology Co ltd
Priority to CN202310300459.2A
Publication of CN116029941A
Application granted
Publication of CN116029941B

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02W: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO WASTEWATER TREATMENT OR WASTE MANAGEMENT
    • Y02W30/00: Technologies for solid waste management
    • Y02W30/50: Reuse, recycling or recovery technologies
    • Y02W30/58: Construction or demolition [C&D] waste

Abstract

The invention relates to the technical field of image processing, in particular to a visual image enhancement processing method for construction waste. The method comprises the following steps: according to the gradient directions of the pixel points in the gray image of the construction waste, obtain each feature pixel point set, and obtain the degree of dispersion of the gradient directions of the feature pixel points in each set; obtain the irregularity degree corresponding to each set according to the relative positions between its feature pixel points and its center line; obtain a first likelihood index based on the degree of dispersion and the irregularity degree; obtain a second likelihood index according to the target connected domain and the gray upper and lower limit value corresponding to each set. From these, determine the edge confidence, obtain the corresponding adaptive superposition coefficient based on the edge confidence, and thereby obtain the enhanced image. The invention improves the enhancement effect of construction waste images.

Description

Visual image enhancement processing method for construction waste
Technical Field
The invention relates to the technical field of image processing, in particular to a visual image enhancement processing method of construction waste.
Background
The recycling of construction waste is currently a hot topic. Large quantities of construction waste such as bricks, stones and concrete left after the demolition of waste buildings can be reused as a renewable resource once sorted, cleaned or crushed. To utilize the crushed aggregate, however, it must first be sorted according to the size, shape and other properties of the aggregate, so that it can be put to different uses.
A common sharpening approach enhances the edge details of the image through an unsharp masking algorithm. Although the method is simple and its enhancement of high-frequency information such as edges and details is obvious, it also amplifies noise information belonging to the high-frequency part; moreover, during superposition the protruding edge parts span larger jumps, causing overshoot and undershoot, so the enhancement effect of the image is poor.
Disclosure of Invention
In order to solve the problem that the enhancement effect is poor when the existing sharpening mode is used for enhancing the image, the invention aims to provide a visual image enhancement processing method for construction waste, which adopts the following technical scheme:
the invention provides a visual image enhancement processing method of construction waste, which comprises the following steps:
acquiring a gray image of the construction waste to be detected;
obtaining each characteristic pixel point set according to the gradient direction of each pixel point in the gray level image and the gradient direction of the pixel point in the preset adjacent area of each pixel point; obtaining the degree of dispersion of the gradient direction of the characteristic pixel points in each characteristic pixel point set; obtaining the irregularity degree corresponding to each characteristic pixel point set according to the relative positions between the characteristic pixel points and the central lines of the corresponding characteristic pixel point sets; obtaining a first possibility index corresponding to each characteristic pixel point set based on the discrete degree and the irregular degree;
obtaining a target connected domain corresponding to each characteristic pixel point set and a corresponding gray upper and lower limit value based on the gray value of each characteristic pixel point in each characteristic pixel point set and the gray value of each pixel point in a preset neighborhood of each characteristic pixel point, and obtaining a second possibility index corresponding to each characteristic pixel point set according to the target connected domain and the gray upper and lower limit value;
Determining edge confidence coefficient of each characteristic pixel point set based on the first probability index and the second probability index, and obtaining an adaptive superposition coefficient corresponding to each characteristic pixel point set based on the edge confidence coefficient; and obtaining an enhanced image based on the adaptive superposition coefficient and the gray scale image.
Preferably, the obtaining each feature pixel point set according to the gradient direction of each pixel point in the gray scale image and the gradient direction of the pixel point in the preset neighborhood of each pixel point includes:
based on a region growing algorithm, each pixel point in the gray level image is used as an initial growing point for growing, and the growing conditions are as follows: judging whether the difference between the growth point and the gradient direction of each pixel point in the preset neighborhood is smaller than a gradient difference threshold value, and if so, taking the corresponding neighborhood pixel point as a new growth point;
and taking the set of all the growing points after the growth is completed as a characteristic pixel point set corresponding to the initial growing point.
Preferably, the obtaining the degree of dispersion of the gradient direction of the feature pixel point in each feature pixel point set includes:
according to the gradient directions of the characteristic pixel points in each characteristic pixel point set, calculating the standard deviation of the gradient directions of all the characteristic pixel points in each characteristic pixel point set, and taking the standard deviation as the discrete degree of the gradient directions of the characteristic pixel points in the corresponding characteristic pixel point set.
Preferably, the obtaining the irregularity degree corresponding to each feature pixel point set according to the relative position between each feature pixel point and the center line of the corresponding feature pixel point set includes:
for the first
Figure SMS_1
A set of feature pixels:
respectively calculate the first
Figure SMS_2
Every characteristic pixel point in every characteristic pixel point set to the first
Figure SMS_3
The vertical distance of the central line of each characteristic pixel point set is calculated according to the vertical distance
Figure SMS_4
Standard deviation of vertical distances from all feature pixel points in the feature pixel point set to the central line; respectively acquiring the feature pixel points with the farthest left and right sides of the center point of the feature pixel point set and the farthest distance from the center point, calculating the distance between the two feature pixel points, and taking the distance as the first feature pixel point
Figure SMS_5
The distance between the head end point and the tail end point of the characteristic pixel point set;
according to the first
Figure SMS_6
The number, the first, of the feature pixel points in the feature pixel point set
Figure SMS_7
The distance between the head point and the tail point of the feature pixel point set and the third point
Figure SMS_8
Calculating standard deviation of vertical distances between all feature pixel points in the feature pixel point set and the central line
Figure SMS_9
The irregularity degree corresponding to the characteristic pixel point set;
said first
Figure SMS_10
The acquisition process of the central line of each characteristic pixel point set comprises the following steps: acquisition of the first
Figure SMS_11
Coordinates of a center point of the set of feature pixel points; calculate the first
Figure SMS_12
The average value of the gradient directions of all the characteristic pixel points in the characteristic pixel point set is taken as the first
Figure SMS_13
Average gradient directions corresponding to the feature pixel point sets; through the first step
Figure SMS_14
The central point of each characteristic pixel point set is used as a straight line perpendicular to the direction of the average gradient direction as the first point
Figure SMS_15
Center lines of the feature pixel point sets.
Preferably, the irregularity degree corresponding to the $i$-th feature pixel point set is calculated using the following formula:

$$B_i=\ln\left(\frac{n_i}{d_i}\cdot\frac{\sigma_i}{\sigma_0}\right)$$

where $B_i$ is the irregularity degree corresponding to the $i$-th feature pixel point set, $n_i$ is the number of feature pixel points in the $i$-th feature pixel point set, $d_i$ is the distance between the head and tail end points of the $i$-th feature pixel point set, $\sigma_i$ is the standard deviation of the perpendicular distances from all feature pixel points in the $i$-th feature pixel point set to the center line, $\sigma_0$ is a preset standard deviation threshold, and $\ln$ is the logarithm with natural base.
Preferably, the obtaining, based on the discrete degree and the irregular degree, a first likelihood index corresponding to each feature pixel point set includes:
and calculating products of the irregularity degree and the discrete degree corresponding to each characteristic pixel point set, and taking the products as first possibility indexes corresponding to each characteristic pixel point set.
Preferably, the obtaining the target connected domain and the corresponding upper and lower gray-scale limit values corresponding to each feature pixel point set based on the gray-scale values of each feature pixel point in each feature pixel point set and the gray-scale values of the pixel points in the preset neighborhood of each feature pixel point includes:
for the first
Figure SMS_28
A set of feature pixels:
if at first
Figure SMS_29
Only one pixel point corresponding to the minimum gray value in the characteristic pixel point set is provided, and the first pixel point is then
Figure SMS_30
The pixel point corresponding to the minimum gray value in the characteristic pixel point set is used as an initial center point, the initial center point is used as a starting point for carrying out connected domain analysis,obtaining a target connected domain; if more than one pixel point corresponding to the minimum gray value in the feature pixel point set is used, each pixel point corresponding to the minimum gray value in the feature pixel point set is used as an initial center point, the initial center point is used as a starting point to conduct connected domain analysis, a plurality of connected domains are obtained, the minimum gray value of the pixel points in each connected domain is used as a reference gray value, and the connected domain corresponding to the minimum reference gray value is used as a first connected domain
Figure SMS_31
A target connected domain corresponding to the characteristic pixel point set;
obtaining the minimum gray value of the pixel points in the target connected domain; calculate the first
Figure SMS_32
The difference between the maximum gray value of the feature pixel points in the feature pixel point set and the minimum gray value of the pixel points in the target connected domain is used as the first gray value
Figure SMS_33
And the corresponding gray scale upper and lower limit values are collected by the characteristic pixel points.
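As a rough sketch of the target-connected-domain step: the patent does not state the growth criterion used by the connected-domain analysis, so the gray-tolerance rule (`tol`) below is a hypothetical stand-in; the gray upper and lower limit value and the second likelihood index follow the definitions above.

```python
from collections import deque
import numpy as np

def gray_limit_index(gray, feature_set, tol=5):
    """Target connected domain, gray upper/lower limit value, and second
    likelihood index for one feature pixel point set. `tol` (stay within
    `tol` gray levels of the seed) is a hypothetical growth rule."""
    ys, xs = zip(*feature_set)
    vals = gray[ys, xs]
    # every minimum-gray pixel of the set is a candidate initial center point
    seeds = [p for p, v in zip(feature_set, vals) if v == vals.min()]
    best_domain, best_ref = None, None
    for seed in seeds:
        seen, q = {seed}, deque([seed])
        while q:                                   # BFS connected-domain analysis
            y, x = q.popleft()
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < gray.shape[0] and 0 <= nx < gray.shape[1]
                            and (ny, nx) not in seen
                            and abs(int(gray[ny, nx]) - int(gray[seed])) <= tol):
                        seen.add((ny, nx))
                        q.append((ny, nx))
        ref = min(int(gray[p]) for p in seen)      # reference gray value
        if best_ref is None or ref < best_ref:     # keep smallest reference gray
            best_domain, best_ref = seen, ref
    limit = int(vals.max()) - best_ref             # gray upper/lower limit value
    return best_domain, limit, limit * len(best_domain)  # last: 2nd likelihood index
```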
Preferably, the obtaining the second probability index corresponding to each feature pixel point set according to the target connected domain and the gray upper and lower limit values includes:
and calculating products of the gray upper and lower limit values corresponding to the characteristic pixel point sets and the number of the pixel points in the corresponding target connected domain, and taking the products as second possibility indexes corresponding to the characteristic pixel point sets.
Preferably, the first likelihood indicator and the second likelihood indicator are both in positive correlation with edge confidence.
Preferably, the obtaining the adaptive stacking coefficient corresponding to each feature pixel point set based on the edge confidence includes:
for the first
Figure SMS_34
A set of feature pixels:
when the first is
Figure SMS_35
When the edge confidence coefficient of each characteristic pixel point set is smaller than or equal to 0, taking a constant 1 as an adaptive superposition coefficient corresponding to the characteristic pixel point set; when the first is
Figure SMS_36
When the edge confidence coefficient of each feature pixel point set is greater than 0, calculating the sum of the constant 1 and the edge confidence coefficient of the feature pixel point set, and taking the sum as the first
Figure SMS_37
And the self-adaptive superposition coefficients corresponding to the characteristic pixel point sets.
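The piecewise rule above reduces to a one-liner; a minimal sketch:

```python
def adaptive_coeff(edge_confidence):
    # constant 1 for sets judged non-edge (confidence <= 0);
    # 1 + confidence for likely edges, so enhancement scales with confidence
    return 1.0 if edge_confidence <= 0 else 1.0 + edge_confidence
```

A coefficient of 1 leaves a region unchanged when it multiplies the high-frequency term of a masking superposition, while coefficients above 1 strengthen only the regions judged to be aggregate edges.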
The invention has at least the following beneficial effects:
1. The invention takes into account that when a conventional unsharp masking algorithm blurs an image with ordinary mean filtering, the obtained enhancement result enhances not only the image edges but also extraneous factors such as part of the noise and the texture on the aggregate surface. The invention instead performs linear connected-region judgment with the gradient direction as the judgment basis to obtain a plurality of feature pixel point sets, judges the edge confidence of each feature pixel point set by combining the characteristics of the aggregate after the construction waste is crushed, and, based on the edge confidence, sets different superposition coefficients for different regions of the gray image of the construction waste to be detected. Compared with conventionally superimposing the high-frequency image on the original image, this changes extraneous regions such as aggregate surface texture and noise points less while enhancing the edge information between aggregates, thereby enhancing only the aggregate edges and effectively improving the accuracy of aggregate detection.
2. When judging the edge confidence of a feature pixel point set, the method analyzes the linear features and gray features of the set from two angles. A first likelihood index for the set is obtained from the degree of dispersion of the gradient directions of its feature pixel points and from its irregularity degree, and a second likelihood index is obtained from the set's target connected domain and the corresponding gray upper and lower limit difference value. The first likelihood index measures, from the shape characteristics of aggregate edges, the possibility that the region formed by the set is an aggregate edge; the second measures the same possibility from the gray characteristics of the aggregate surface. The surface features of the construction waste are thus analyzed comprehensively, making the judgment result of the edge confidence more reliable.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a visual image enhancement processing method of construction waste.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve the preset aim, the following detailed description is given to a visual image enhancement processing method of construction waste according to the invention by combining the attached drawings and the preferred embodiment.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of the visual image enhancement processing method for construction waste provided by the invention with reference to the accompanying drawings.
An embodiment of a visual image enhancement processing method for construction waste comprises the following steps:
the embodiment provides a visual image enhancement processing method of construction waste, as shown in fig. 1, the visual image enhancement processing method of construction waste of the embodiment comprises the following steps:
step S1, acquiring a gray image of the construction waste to be detected.
The specific scene addressed by this embodiment is as follows: after solid construction waste such as stone, bricks and concrete is broken into recycled aggregate, surface images of the construction waste to be detected are collected. Combining the actual edge characteristics of the aggregate, interference information irrelevant to the required aggregate edges, such as textures and noise points, is distinguished in the images so as to obtain adaptive superposition coefficients, and different regions are then enhanced to different extents.
Solid construction waste such as stone, bricks and concrete is broken into recycled aggregate and conveyed by equipment such as a conveyor belt; a stacked pile forms at the discharge outlet and is conveyed to a detection area. An industrial camera arranged directly above the detection area captures a top-down surface image of the construction waste to be detected, and the collected image is converted to grayscale to obtain the gray image of the construction waste to be detected. The graying process is prior art and is not described in detail here.
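Since the patent treats graying as prior art, the sketch below uses standard luminance weighting; the ITU-R BT.601 weights are one common choice, not something the patent specifies.

```python
import numpy as np

def to_gray(bgr):
    # luminance-weighted graying (ITU-R BT.601 weights, one common convention
    # for BGR channel order); any standard graying method would serve here
    b = bgr[..., 0].astype(float)
    g = bgr[..., 1].astype(float)
    r = bgr[..., 2].astype(float)
    return (0.114 * b + 0.587 * g + 0.299 * r).astype(np.uint8)
```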
Step S2, obtaining each characteristic pixel point set according to the gradient direction of each pixel point in the gray level image and the gradient direction of the pixel point in the preset adjacent area of each pixel point; obtaining the degree of dispersion of the gradient direction of the characteristic pixel points in each characteristic pixel point set; obtaining the irregularity degree corresponding to each characteristic pixel point set according to the relative positions between the characteristic pixel points and the central lines of the corresponding characteristic pixel point sets; and obtaining a first possibility index corresponding to each characteristic pixel point set based on the discrete degree and the irregular degree.
The conventional unsharp masking algorithm obtains a high-frequency image by subtracting the blurred image from the original image, and superimposes the high-frequency image on the original image in the same proportion everywhere. In this embodiment, the gradient direction of the pixel points is taken as the judgment basis to obtain the sets of linear regions, and confidence analysis is performed by combining the actual aggregate characteristics of the recycled concrete; when the high-frequency information image is superimposed on the original image, different degrees of superposition are applied according to the confidence of the different high-frequency regions, so that the actual edges are enhanced to the greatest extent.
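For context, the conventional unsharp masking baseline that the method improves on can be sketched as follows; the 3x3 mean filter and the scalar-or-per-pixel `coeff` are illustrative assumptions, since the patent leaves the blur kernel unspecified.

```python
import numpy as np

def unsharp(gray, coeff, k=3):
    """Classic unsharp masking with a mean-filter blur. `coeff` may be a
    scalar or a per-pixel map of superposition coefficients (the adaptive
    case described later in this document)."""
    pad = k // 2
    padded = np.pad(gray.astype(float), pad, mode='edge')
    # mean filter via a sliding-window sum over the k*k neighborhood
    blur = np.zeros(gray.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            blur += padded[dy:dy + gray.shape[0], dx:dx + gray.shape[1]]
    blur /= k * k
    high = gray - blur               # high-frequency image
    out = gray + coeff * high        # superposition onto the original
    return np.clip(out, 0, 255).astype(np.uint8)
```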
The recycled aggregate produced by crushing construction waste presents irregular edge characteristics. Although the aggregate surface also carries edge-like features, the gray gradient of the edges generated where aggregates are stacked and staggered is larger; both are linear edges, but at the pixel level the surface texture is smoother while the aggregate edges are irregular. In this embodiment, the gradient direction of each pixel point in the gray image of the construction waste to be detected is calculated first; the method for calculating the gradient direction is prior art and is not described in detail here.
For any pixel point in the gray image of the construction waste to be detected: mark the pixel point as a feature pixel point and obtain the gradient direction of each pixel point in its preset neighborhood. Calculate the absolute difference between the gradient direction of the pixel point and that of each pixel point in its preset neighborhood, and take it as the gradient direction difference of the corresponding neighborhood pixel point. Set a gradient difference threshold and judge whether each gradient direction difference is smaller than this threshold; if so, take the corresponding neighborhood pixel point as a feature pixel point as well. Then continue with each new feature pixel point as the center: obtain the gradient direction of each pixel in its preset neighborhood, calculate the absolute differences in the same way, and repeat the judgment until the gradient direction differences between every feature pixel point and all pixels in its preset neighborhood are greater than or equal to the gradient difference threshold. The set of all pixel points in the connected domain formed by the original pixel point and the feature pixel points grown from it is taken as one feature pixel point set.
Meanwhile, the other pixel points in the gray image of the construction waste to be detected are judged by the same method to obtain a plurality of feature pixel point sets. This embodiment performs the growth with a region growing algorithm: each pixel point in the gray image is used as an initial growing point, and the growing condition is to judge whether the difference between the gradient direction of the growing point and that of each pixel point in its preset neighborhood is smaller than the gradient difference threshold; if so, the corresponding neighborhood pixel point is taken as a new growing point. The set of all growing points after growth is completed is taken as the feature pixel point set corresponding to the initial growing point. The preset neighborhood in this embodiment is the eight-neighborhood, and the gradient difference threshold is a preset value; in a specific application, the implementer may set it.
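The region-growing rule described above can be sketched as follows; `grad_dir` is an array of per-pixel gradient directions and `thresh` stands in for the unspecified gradient difference threshold.

```python
from collections import deque
import numpy as np

def grow_feature_set(grad_dir, seed, thresh):
    """Region growing on gradient direction over the eight-neighborhood:
    a neighbor joins the set when its gradient direction differs from the
    current growing point's by less than `thresh`."""
    h, w = grad_dir.shape
    region, q = {seed}, deque([seed])
    while q:
        y, x = q.popleft()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (0 <= ny < h and 0 <= nx < w and (ny, nx) not in region
                        and abs(grad_dir[ny, nx] - grad_dir[y, x]) < thresh):
                    region.add((ny, nx))
                    q.append((ny, nx))
    return region
```

Running this once per pixel (skipping pixels already absorbed into a set) yields the plurality of feature pixel point sets.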
Considering that the standard deviation can reflect the degree of dispersion of a data set, this embodiment calculates, from the gradient direction of each feature pixel point in each feature pixel point set, the standard deviation of the gradient directions of all feature pixel points in the set, and takes it as the degree of dispersion of the gradient directions of the feature pixel points in that set. For the $i$-th feature pixel point set, the corresponding degree of dispersion is:

$$S_i=\sqrt{\frac{1}{n_i}\sum_{j=1}^{n_i}\left(\theta_{i,j}-\bar{\theta}_i\right)^{2}}$$

where $S_i$ is the degree of dispersion of the gradient directions of the feature pixel points in the $i$-th feature pixel point set, $n_i$ is the number of feature pixel points in the $i$-th feature pixel point set, $\theta_{i,j}$ is the gradient direction of the $j$-th feature pixel point in the $i$-th feature pixel point set, and $\bar{\theta}_i$ is the mean of the gradient directions of all feature pixel points in the $i$-th feature pixel point set.
The larger the degree of dispersion of the gradient directions of the feature pixel points in the $i$-th feature pixel point set, the more discrete the gradient direction distribution of the feature pixel points in the set, i.e. the less the gradients all point in the same direction. The smaller the degree of dispersion, the more uniform the gradient directions of the feature pixel points in the $i$-th feature pixel point set.
By adopting the method, the degree of dispersion of the gradient direction of the characteristic pixel points in each characteristic pixel point set can be obtained.
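The degree of dispersion is a plain (population) standard deviation over the gradient directions of one set; a minimal sketch:

```python
import numpy as np

def dispersion(grad_dirs):
    # population standard deviation of the gradient directions in one
    # feature pixel point set, matching the formula above
    g = np.asarray(grad_dirs, dtype=float)
    return float(np.sqrt(np.mean((g - g.mean()) ** 2)))
```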
Considering that the degree of dispersion of the gradient direction cannot fully reflect whether the region formed by the feature pixel points in a feature pixel point set is an aggregate edge: besides the aggregate edges, part of the texture retained from before the crushing of the construction waste is always present on the aggregate surface. The difference between this texture and a true edge is that the gradient directions of the pixel points along the texture are orderly, while the gradient directions of the edge pixel points are more discrete. Thus, for the $i$-th feature pixel point set, calculate the mean $(\bar{x}_i,\bar{y}_i)$ of the coordinates of all its feature pixel points and take it as the coordinates of the center point of the set, and calculate the mean $\bar{\theta}_i$ of the gradient directions of all its feature pixel points and take it as the average gradient direction of the set. Through the center point $(\bar{x}_i,\bar{y}_i)$, draw a straight line perpendicular to the average gradient direction and take it as the center line of the set. Then calculate the perpendicular distance from each feature pixel point in the set to the center line, and from these distances calculate the standard deviation of the perpendicular distances from all feature pixel points in the $i$-th feature pixel point set to the center line. Meanwhile, obtain the feature pixel points farthest from the center point on its left and right sides, i.e. select two feature pixel points, take one as the head end point of the set and the other as the tail end point, and calculate the distance between the head and tail end points. The formula for the perpendicular distance from a point to a straight line is prior art and is not repeated here.
There are certain texture features on the aggregate surface which, like the edge features, are linear; the difference is that the texture features of the aggregate surface are more regular, while a true edge is an uneven curve, so the standard deviation of the perpendicular distances from all feature pixel points of the $i$-th feature pixel point set to its center line is not 0. On this basis, this embodiment distinguishes a flat straight line from an uneven curve: the irregularity degree corresponding to the $i$-th feature pixel point set is calculated from the number of feature pixel points in the set, the distance between its head and tail end points, and the standard deviation of the perpendicular distances from all its feature pixel points to the center line, namely:
$$B_i=\ln\left(\frac{n_i}{d_i}\cdot\frac{\sigma_i}{\sigma_0}\right)$$

where $B_i$ is the irregularity degree corresponding to the $i$-th feature pixel point set, $n_i$ is the number of feature pixel points in the $i$-th feature pixel point set, $d_i$ is the distance between the head and tail end points of the $i$-th feature pixel point set, $\sigma_i$ is the standard deviation of the perpendicular distances from all feature pixel points in the $i$-th feature pixel point set to the center line, $\sigma_0$ is the preset standard deviation threshold, and $\ln$ is the logarithm with natural base.
The larger B_k is, the less the region formed by the k-th feature pixel point set approaches a straight line, i.e. the shape of the region formed by the feature pixel points in the k-th feature pixel point set is not close to a flat straight line and is more consistent with actual aggregate edge features. By setting the parameter N_k/L_k and the preset standard deviation threshold σ_0 in the formula: when the region formed by the k-th feature pixel point set is almost only one pixel wide, σ_k approaches 0, which is close to the more regular texture edge features, and the obtained B_k is negative; when the distribution of the pixels in the k-th feature pixel point set is less regular, σ_k increases and N_k/L_k grows to a certain extent, so the obtained B_k is greater than 0. Since a logarithmic function with the natural constant as base increases only slightly once its argument exceeds a certain value, the formula mainly serves to distinguish texture edge features from actual aggregate edge features. The preset standard deviation threshold σ_0 is 0.1 in this embodiment; in specific applications, the practitioner can set it according to the specific situation. The distinction is achieved mainly by translating the logarithmic function and adjusting its amplitude, the amplitude being determined mainly by the coefficient N_k/L_k. Since σ_k and N_k/L_k are both positively correlated with the degree of irregularity, they are combined by multiplication: the higher the finally obtained irregularity degree B_k, the higher the probability that the k-th feature pixel point set is the aggregate edge.
By adopting the method, the irregularity degree corresponding to each characteristic pixel point set can be obtained.
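A minimal sketch of the irregularity computation, under the assumption that it takes the form B_k = (N_k / L_k) · ln(σ_k / σ_0), which is consistent with the behavior described above (negative for near one-pixel-wide straight regions with σ_k below the threshold, positive and slowly growing for uneven curves); all names are illustrative:

```python
import math

def irregularity_degree(n_points, head_tail_dist, sigma, sigma0=0.1):
    """Irregularity B_k = (N_k / L_k) * ln(sigma_k / sigma0).
    sigma must be > 0; sigma < sigma0 yields a negative value (straight,
    texture-like region), sigma > sigma0 a positive one (uneven edge)."""
    return (n_points / head_tail_dist) * math.log(sigma / sigma0)
```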
Since the irregularity degree and the dispersion degree corresponding to the k-th feature pixel point set are both positively correlated with the probability of belonging to the aggregate edge, multiplication is used as the combination: the product of the irregularity degree B_k corresponding to the k-th feature pixel point set and the dispersion degree D_k of the gradient directions of the feature pixel points in the k-th feature pixel point set is taken as the first likelihood index corresponding to the k-th feature pixel point set, used to characterize the probability that the feature pixel point set is an aggregate edge. When the shape features exhibited by the region formed by the k-th feature pixel point set differ greatly from edge features, the corresponding first likelihood index is very small, even negative; as the region formed by the k-th feature pixel point set gradually approaches edge features, the corresponding first likelihood index gradually increases. Multiplication is used here because the irregularity degree B_k and the dispersion degree D_k have different value ranges: the dispersion degree D_k takes relatively large values, while the amplitude of the irregularity degree B_k changes little once it reaches a certain value and distinguishes texture edge features from actual aggregate edge features by its sign. The irregularity degree B_k therefore mainly contributes the sign and the dispersion degree D_k mainly contributes the magnitude, so multiplication is used to combine them into the corresponding first likelihood index. By adopting the method, the first likelihood index corresponding to each feature pixel point set can be obtained.
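The combination by multiplication described above can be sketched as follows, with the dispersion taken as the standard deviation of the gradient directions (names illustrative):

```python
import numpy as np

def first_likelihood(irregularity, gradient_dirs):
    """F_k = B_k * D_k: the irregularity B_k supplies the sign
    (texture edge vs. actual edge), the dispersion D_k of the
    gradient directions supplies the magnitude."""
    dispersion = float(np.std(gradient_dirs))
    return irregularity * dispersion
```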
And S3, obtaining a target connected domain corresponding to each characteristic pixel point set and a corresponding gray upper and lower limit difference value based on the gray value of each characteristic pixel point in each characteristic pixel point set and the gray value of the pixel point in the preset neighborhood of each characteristic pixel point, and obtaining a second possibility index corresponding to each characteristic pixel point set according to the target connected domain and the gray upper and lower limit difference value.
In step S2, the judgment is performed based on the shape features of each feature pixel point set; however, the surface texture of construction waste is complex, and the shape features of some textures are very close to those of the aggregate edge, so the aggregate edge needs to be further distinguished using the fact that aggregate edges exhibit shadows caused by the stacking of aggregates.
For the k-th feature pixel point set:

Acquire the gray value of each feature pixel point in the feature pixel point set. If only one pixel point corresponds to the minimum gray value g_min in the feature pixel point set, take the pixel point corresponding to g_min as the initial center point and perform connected-domain analysis starting from it, using a region growing algorithm with the following growth condition: judge whether the gray value of each pixel point in the preset neighborhood of the growth point is smaller than the gray value of the growth point, and if so, take the corresponding neighborhood pixel point as a new growth point. Specifically, acquire the gray values g_1, g_2, ..., g_M of the pixel points in the preset neighborhood of the initial center point; the preset neighborhood in this embodiment is the eight-neighborhood, so M = 8, and the gray value of the initial center point is g_c. If g_m < g_c, merge the initial center point with the m-th pixel point in its preset neighborhood; repeat the judgment with the m-th pixel point as the center point until the gray values of all pixel points in the preset neighborhood of the center point are greater than or equal to the gray value of the center point. If g_m ≥ g_c, the initial center point and the m-th pixel point in the preset neighborhood do not belong to the same connected domain and are not merged. By this method, the pixel points meeting the judgment condition are merged into one connected domain, recorded as the target connected domain corresponding to the k-th feature pixel point set. If more than one pixel point corresponds to the minimum gray value g_min in the feature pixel point set, repeat the above operation starting from each such pixel point to obtain multiple connected domains; acquire the minimum gray value of the pixel points in each connected domain as its reference gray value, and take the connected domain corresponding to the smallest reference gray value as the target connected domain corresponding to the k-th feature pixel point set. Acquire the minimum gray value g'_min of the pixel points in the target connected domain; compute the maximum gray value g_max of the feature pixel points in the k-th feature pixel point set minus the minimum gray value g'_min of the pixel points in the target connected domain, and record the difference as the gray upper-lower limit difference corresponding to the k-th feature pixel point set. The larger this difference is, the more the region formed by the k-th feature pixel point set accords with the features of an actual aggregate edge.
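The region growing step above can be sketched as follows; this is a simplified illustration of the stated growth condition (an 8-neighbour joins when its gray value is strictly smaller than the current growth point's), not the patent's exact implementation, and all names are illustrative:

```python
import numpy as np
from collections import deque

def target_connected_domain(gray, seed):
    """Grow a connected domain from `seed` (row, col): a pixel in the
    8-neighbourhood of a growth point joins when its gray value is
    strictly smaller than the growth point's, so growth descends into
    darker (shadow-like) regions."""
    h, w = gray.shape
    visited = {seed}
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dy == 0 and dx == 0:
                    continue
                ny, nx = y + dy, x + dx
                if (0 <= ny < h and 0 <= nx < w
                        and (ny, nx) not in visited
                        and gray[ny, nx] < gray[y, x]):
                    visited.add((ny, nx))
                    queue.append((ny, nx))
    return visited
```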
The width of the region formed by the k-th feature pixel point set is not necessarily only one pixel, but the gray values at an edge exhibit a certain decreasing character: the edge features formed by aggregate stacking are mainly shadow-like, and the texture features formed by protrusions on the aggregate surface also show gray differences, yet these differ considerably from actual edge features, so gray information alone cannot fully determine whether the region formed by the feature pixel point set is a real aggregate edge. The embodiment therefore determines, from the gray upper-lower limit difference ΔG_k corresponding to the k-th feature pixel point set and the number n_k of pixel points in its target connected domain, the second likelihood index corresponding to the k-th feature pixel point set, which characterizes from the gray-feature point of view the probability that the feature pixel point set is an aggregate edge. The second likelihood index corresponding to the k-th feature pixel point set is:

Y_k = ΔG_k · n_k

wherein Y_k is the second likelihood index corresponding to the k-th feature pixel point set, ΔG_k is the gray upper-lower limit difference corresponding to the k-th feature pixel point set, and n_k is the number of pixel points in the target connected domain corresponding to the k-th feature pixel point set.
The larger the gray upper-lower limit difference ΔG_k corresponding to the k-th feature pixel point set, the more the region formed by the set accords with the features of the aggregate edge; the larger n_k, the greater the probability that the regions adjacent to the region formed by the k-th feature pixel point set are shadow regions formed by stacking between aggregates. The gray value of a shadow region is small and differs greatly from the gray value of the aggregate after the concrete is crushed; gaps formed by stacking between aggregates show shadow features, but smaller spots also exist on the aggregate surface, and their largest difference from shadow regions is the size of the area. Therefore, the probability that the k-th feature pixel point set is an aggregate edge is determined according to the gray upper-lower limit difference ΔG_k and the area of the target connected domain corresponding to the k-th feature pixel point set. When ΔG_k is larger and the number of pixel points in the target connected domain is larger, the second likelihood index corresponding to the k-th feature pixel point set is larger; when ΔG_k is smaller and the number of pixel points in the target connected domain is smaller, the second likelihood index corresponding to the k-th feature pixel point set is smaller.
By adopting the method, the second possibility index corresponding to each characteristic pixel point set can be obtained.
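A minimal sketch of the second likelihood index as the product of the gray upper-lower limit difference and the target-domain size (names illustrative):

```python
import numpy as np

def second_likelihood(set_gray_values, domain_gray_values):
    """Y_k = (max gray of the feature set - min gray of its target
    connected domain) * number of pixels in the domain."""
    delta = int(np.max(set_gray_values)) - int(np.min(domain_gray_values))
    return delta * len(domain_gray_values)
```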
Step S4, determining the edge confidence coefficient of each characteristic pixel point set based on the first probability index and the second probability index, and obtaining the adaptive superposition coefficient corresponding to each characteristic pixel point set based on the edge confidence coefficient; and obtaining an enhanced image based on the adaptive superposition coefficient and the gray scale image.
In the above steps, this embodiment obtained the first likelihood index and the second likelihood index corresponding to each feature pixel point set: the first likelihood index characterizes, from the angle of linear features, the probability that the feature pixel point set is an aggregate edge, and the second likelihood index characterizes the same probability from the angle of gray features. Based on this, the embodiment next determines the corresponding edge confidence from the first and second likelihood indexes. For the k-th feature pixel point set, the edge confidence is obtained from its first and second likelihood indexes, namely:

C_k = Norm(F_k · Y_k)

wherein C_k is the edge confidence of the k-th feature pixel point set, F_k is the first likelihood index corresponding to the k-th feature pixel point set, Y_k is the second likelihood index corresponding to the k-th feature pixel point set, and Norm is a normalization function.
Since the sign of the first likelihood index F_k is the more important judgment condition, the embodiment combines the two likelihood indexes by multiplication in order to preserve this positive/negative character; and since the two likelihood indexes are equally important for judging the feature pixel point set, the embodiment gives them the same weight. The greater the edge confidence of the k-th feature pixel point set, the greater the probability that the region of the set is an aggregate edge.
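A hedged sketch of the edge confidence: the text only states that a normalization function is applied to the product of the two indexes while preserving the sign, so the sign-preserving squashing x / (1 + |x|) used here is an assumption, not the patent's stated Norm:

```python
def edge_confidence(f_index, y_index):
    """C_k = Norm(F_k * Y_k); Norm is assumed here to be the
    sign-preserving squashing x / (1 + |x|), mapping into (-1, 1)."""
    x = f_index * y_index
    return x / (1.0 + abs(x))
```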
By the above steps, the edge confidence of each feature pixel point set can be obtained. The texture on the surface of construction waste aggregate is not merely simple line texture but is structurally more complex, so the preceding steps judge the corresponding shape features through the region formed by each feature pixel point set: when the region where the pixel points lie is not an obvious edge line, the obtained edge confidence is low, and the remaining regions are characterized based on the difference between the actual aggregate edge and the more regular line textures. Thus the higher the obtained edge confidence, the closer the region is to actual aggregate edge features, the larger the obtained adaptive superposition coefficient, and the greater the degree of enhancement finally obtained. For the k-th feature pixel point set: when the edge confidence of the feature pixel point set is less than or equal to 0, take the constant 1 as the adaptive superposition coefficient corresponding to the feature pixel point set; when the edge confidence of the feature pixel point set is greater than 0, calculate the sum of the constant 1 and the edge confidence of the feature pixel point set, and take the sum as the adaptive superposition coefficient corresponding to the feature pixel point set. By this method, the adaptive superposition coefficient corresponding to each feature pixel point set can be obtained; since the edge confidence of each pixel point equals the edge confidence of the set it belongs to, the adaptive superposition coefficient of each feature pixel point, i.e. of each pixel point in the gray image of the construction waste to be detected, is thereby obtained.
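The superposition-coefficient rule above is simple enough to state directly (names illustrative):

```python
def adaptive_coefficient(edge_conf):
    """Constant 1 when edge confidence <= 0 (region left unchanged);
    1 + confidence when > 0 (likely aggregate edges are amplified)."""
    return 1.0 if edge_conf <= 0 else 1.0 + edge_conf
```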
The gray image of the construction waste to be detected is enhanced via the conventional unsharp masking procedure: the gray image is low-pass filtered with a mean filtering algorithm to obtain a blurred image; the blurred image is subtracted from the gray image of the construction waste to be detected to obtain an image retaining the high-frequency information; according to the adaptive superposition coefficient of each pixel point in the gray image, the pixels of the high-frequency image are multiplied by their corresponding adaptive superposition coefficients to obtain an amplified adaptive high-frequency image; and the amplified adaptive high-frequency image is superposed with the gray image of the construction waste to be detected to obtain the enhanced image. The process of obtaining the enhanced image from superposition coefficients is prior art and is not described in detail here.
By adopting the method, the enhancement processing of the gray level image of the construction waste to be detected is completed.
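The adaptive unsharp-masking step can be sketched with a plain 3x3 mean filter as a simplified stand-in for the mean filtering described (names illustrative):

```python
import numpy as np

def mean_filter3(img):
    """3x3 mean filter with edge padding (the blurring/low-pass step)."""
    p = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / 9.0

def adaptive_unsharp(gray, coeffs):
    """Enhanced = gray + coeffs * (gray - blurred): the high-frequency
    residue is amplified per pixel by the adaptive superposition
    coefficients before being added back to the original image."""
    high = gray.astype(float) - mean_filter3(gray)
    return gray.astype(float) + coeffs * high
```

On a flat image the high-frequency residue is zero, so the output equals the input regardless of the coefficients; edges are amplified in proportion to their local coefficient.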
This embodiment considers that the conventional unsharp masking algorithm blurs the image by ordinary mean filtering, so the obtained enhancement result strengthens not only the image edges but also irrelevant factors such as part of the noise and the textures on the aggregate surface. The embodiment therefore judges linear connected regions using the gradient direction as the basis to obtain the feature pixel point sets, judges the edge confidence of each feature pixel point set in combination with the characteristics of crushed construction-waste aggregate, and sets different superposition coefficients for different regions of the gray image of the construction waste to be detected based on the edge confidence. Compared with conventionally superposing the high-frequency image onto the original image, this enhances the edge information between aggregates while leaving irrelevant regions such as surface textures and noise points almost unchanged, thus achieving the aim of enhancing only the aggregate edges and effectively improving the accuracy of aggregate detection.
When determining the edge confidence of the feature pixel point sets, this embodiment analyzes each set from the two angles of linear features and gray features: a first likelihood index is obtained for each feature pixel point set from the dispersion degree and irregularity degree of the gradient directions of its feature pixel points, and a second likelihood index is obtained from the target connected domain corresponding to each feature pixel point set and the corresponding gray upper-lower limit difference. The first likelihood index determines, according to the shape features of the aggregate edge, the probability that the region formed by each feature pixel point set is an aggregate edge; the second likelihood index determines the same probability according to the gray features of the aggregate surface. The surface features of the construction waste are thus analyzed comprehensively, making the determination result of the edge confidence more reliable.

Claims (8)

1. The visual image enhancement processing method of the construction waste is characterized by comprising the following steps of:
acquiring a gray image of the construction waste to be detected;
obtaining each characteristic pixel point set according to the gradient direction of each pixel point in the gray level image and the gradient direction of the pixel point in the preset adjacent area of each pixel point; obtaining the degree of dispersion of the gradient direction of the characteristic pixel points in each characteristic pixel point set; obtaining the irregularity degree corresponding to each characteristic pixel point set according to the relative positions between the characteristic pixel points and the central lines of the corresponding characteristic pixel point sets; obtaining a first possibility index corresponding to each characteristic pixel point set based on the discrete degree and the irregular degree;
obtaining a target connected domain corresponding to each characteristic pixel point set and a corresponding gray upper-lower limit difference value based on the gray value of each characteristic pixel point in each characteristic pixel point set and the gray value of each pixel point in a preset neighborhood of each characteristic pixel point, and obtaining a second possibility index corresponding to each characteristic pixel point set according to the target connected domain and the gray upper-lower limit difference value;
determining edge confidence coefficient of each characteristic pixel point set based on the first probability index and the second probability index, and obtaining an adaptive superposition coefficient corresponding to each characteristic pixel point set based on the edge confidence coefficient; obtaining an enhanced image based on the adaptive superposition coefficients and the grayscale image;
Obtaining the irregularity degree corresponding to each feature pixel point set according to the relative positions between each feature pixel point and the center line of the corresponding feature pixel point set, including:

for the k-th feature pixel point set:

respectively calculating the perpendicular distance from each feature pixel point in the k-th feature pixel point set to the center line of the k-th feature pixel point set, and calculating from these distances the standard deviation of the perpendicular distances from all feature pixel points in the k-th feature pixel point set to the center line; respectively acquiring, on the left and right sides of the center point of the feature pixel point set, the feature pixel point farthest from the center point, calculating the distance between the two feature pixel points, and taking it as the distance between the head and tail endpoints of the k-th feature pixel point set;

calculating the irregularity degree corresponding to the k-th feature pixel point set according to the number of feature pixel points in the k-th feature pixel point set, the distance between the head and tail endpoints of the k-th feature pixel point set, and the standard deviation of the perpendicular distances from all feature pixel points in the k-th feature pixel point set to the center line;

the acquisition process of the center line of the k-th feature pixel point set includes: acquiring the coordinates of the center point of the k-th feature pixel point set; calculating the mean value of the gradient directions of all feature pixel points in the k-th feature pixel point set as the average gradient direction corresponding to the k-th feature pixel point set; taking the straight line through the center point of the k-th feature pixel point set and perpendicular to the average gradient direction as the center line of the k-th feature pixel point set;

calculating the irregularity degree corresponding to the k-th feature pixel point set using the following formula:

B_k = (N_k / L_k) · ln(σ_k / σ_0)

wherein B_k is the irregularity degree corresponding to the k-th feature pixel point set, N_k is the number of feature pixel points in the k-th feature pixel point set, L_k is the distance between the head and tail endpoints of the k-th feature pixel point set, σ_k is the standard deviation of the perpendicular distances from all feature pixel points in the k-th feature pixel point set to the center line, σ_0 is a preset standard deviation threshold, and ln is the logarithm with the natural constant as base.
2. The method for enhancing a visual image of construction waste according to claim 1, wherein the obtaining each feature pixel point set according to a gradient direction of each pixel point in the gray scale image and a gradient direction of a pixel point in a preset neighborhood of each pixel point comprises:
Based on a region growing algorithm, each pixel point in the gray level image is used as an initial growing point for growing, and the growing conditions are as follows: judging whether the difference between the growth point and the gradient direction of each pixel point in the preset neighborhood is smaller than a gradient difference threshold value, and if so, taking the corresponding neighborhood pixel point as a new growth point;
and taking the set of all the growing points after the growth is completed as a characteristic pixel point set corresponding to the initial growing point.
3. The method for enhancing a visual image of construction waste according to claim 1, wherein the step of obtaining the degree of dispersion of the gradient direction of the feature pixel point in each feature pixel point set comprises:
according to the gradient directions of the characteristic pixel points in each characteristic pixel point set, calculating the standard deviation of the gradient directions of all the characteristic pixel points in each characteristic pixel point set, and taking the standard deviation as the discrete degree of the gradient directions of the characteristic pixel points in the corresponding characteristic pixel point set.
4. The method for enhancing a visual image of construction waste according to claim 1, wherein the obtaining a first likelihood index corresponding to each feature pixel point set based on the discrete degree and the irregular degree includes:
And calculating products of the irregularity degree and the discrete degree corresponding to each characteristic pixel point set, and taking the products as first possibility indexes corresponding to each characteristic pixel point set.
5. The method for enhancing a visual image of construction waste according to claim 1, wherein the obtaining the target connected domain and the corresponding upper and lower gray-scale limit values corresponding to each feature pixel point set based on the gray-scale values of each feature pixel point in each feature pixel point set and the gray-scale values of the pixel points in the preset neighborhood of each feature pixel point comprises:
for the k-th feature pixel point set:

if only one pixel point corresponds to the minimum gray value in the k-th feature pixel point set, taking the pixel point corresponding to the minimum gray value in the k-th feature pixel point set as the initial center point and performing connected-domain analysis starting from the initial center point to obtain the target connected domain; if more than one pixel point corresponds to the minimum gray value in the feature pixel point set, taking each pixel point corresponding to the minimum gray value in the feature pixel point set as an initial center point and performing connected-domain analysis starting from each initial center point to obtain a plurality of connected domains, taking the minimum gray value of the pixel points in each connected domain as its reference gray value, and taking the connected domain corresponding to the smallest reference gray value as the target connected domain corresponding to the k-th feature pixel point set;

acquiring the minimum gray value of the pixel points in the target connected domain; calculating the difference between the maximum gray value of the feature pixel points in the k-th feature pixel point set and the minimum gray value of the pixel points in the target connected domain, and taking the difference as the gray upper-lower limit difference corresponding to the k-th feature pixel point set.
6. The method for enhancing a visual image of construction waste according to claim 1, wherein the obtaining the second probability index corresponding to each feature pixel point set according to the target connected domain and the gray upper and lower limit values comprises:
and calculating products of the gray upper and lower limit values corresponding to the characteristic pixel point sets and the number of the pixel points in the corresponding target connected domain, and taking the products as second possibility indexes corresponding to the characteristic pixel point sets.
7. The visual image enhancement processing method of construction waste according to claim 1, wherein the first likelihood index and the second likelihood index are both in positive correlation with edge confidence.
8. The visual image enhancement processing method of construction waste according to claim 1, wherein the obtaining the adaptive superposition coefficient corresponding to each feature pixel point set based on the edge confidence comprises:
for the i-th feature pixel point set:
when the edge confidence of the i-th feature pixel point set is less than or equal to 0, taking the constant 1 as the adaptive superposition coefficient corresponding to that feature pixel point set; when the edge confidence of the i-th feature pixel point set is greater than 0, calculating the sum of the constant 1 and the edge confidence of the feature pixel point set, and taking the sum as the adaptive superposition coefficient corresponding to the i-th feature pixel point set.
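The coefficient rule in claim 8 is a simple piecewise function; a sketch with an illustrative name:

```python
def adaptive_superposition_coefficient(edge_confidence):
    # constant 1 when edge confidence <= 0,
    # otherwise 1 + edge confidence (claim 8)
    return 1.0 if edge_confidence <= 0 else 1.0 + edge_confidence
```

Because the coefficient is never below 1, feature pixel point sets with positive edge confidence are amplified during superposition while the rest pass through unchanged.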
CN202310300459.2A 2023-03-27 2023-03-27 Visual image enhancement processing method for construction waste Active CN116029941B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310300459.2A CN116029941B (en) 2023-03-27 2023-03-27 Visual image enhancement processing method for construction waste


Publications (2)

Publication Number Publication Date
CN116029941A CN116029941A (en) 2023-04-28
CN116029941B true CN116029941B (en) 2023-06-09

Family

ID=86089511

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310300459.2A Active CN116029941B (en) 2023-03-27 2023-03-27 Visual image enhancement processing method for construction waste

Country Status (1)

Country Link
CN (1) CN116029941B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117689552B (en) * 2024-02-02 2024-04-05 科普云医疗软件(深圳)有限公司 Coronary angiography enhancement method for intracardiac interventional therapy

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7529422B2 (en) * 2004-09-22 2009-05-05 Siemens Medical Solutions Usa, Inc. Gradient-based image restoration and enhancement
US8559746B2 (en) * 2008-09-04 2013-10-15 Silicon Image, Inc. System, method, and apparatus for smoothing of edges in images to remove irregularities
CN103761524B (en) * 2014-01-17 2016-11-16 电子科技大学 A kind of linear goal identification based on image and extracting method
CN110443806B (en) * 2019-04-30 2022-05-03 浙江大学 Water surface transparent floating hazardous chemical substance image segmentation method based on target enhancement processing
CN111307070B (en) * 2019-11-05 2021-06-15 长安大学 Method for measuring edge angle of concrete coarse aggregate based on digital image processing
CN115331119B (en) * 2022-10-13 2023-01-31 山东爱福地生物股份有限公司 Solid waste identification method
CN115345883B (en) * 2022-10-19 2023-03-24 元能微电子科技南通有限公司 PCB (printed circuit board) eccentric hole abnormality detection method based on linear gray level enhancement
CN115661187B (en) * 2022-12-12 2023-06-02 山东本草堂中药饮片有限公司 Image enhancement method for analysis of traditional Chinese medicine preparation


Similar Documents

Publication Publication Date Title
CN110443128B (en) Finger vein identification method based on SURF feature point accurate matching
WO2016172827A1 (en) Stepwise-refinement pavement crack detection method
CN116029941B (en) Visual image enhancement processing method for construction waste
CN101430763B (en) Detection method for on-water bridge target in remote sensing image
CN102169580B (en) Self-adaptive image processing method utilizing image statistic characteristics
CN104063866B (en) A kind of particle size detection method in ore transmit process
CN101996328B (en) Wood identification method
CN109377450A (en) A kind of edge-protected denoising method
Wei et al. Beamlet transform based pavement image crack detection
CN104463814A (en) Image enhancement method based on local texture directionality
CN107610147A (en) A kind of waste film Reinforced Aeolian Sand method for processing foundation
CN116342586B (en) Road surface quality detection method based on machine vision
CN116630813B (en) Highway road surface construction quality intelligent detection system
CN114863493B (en) Detection method and detection device for low-quality fingerprint image and non-fingerprint image
CN110874825B (en) Method for extracting binary image of water trace on surface of composite insulator
CN117197140A (en) Irregular metal buckle forming detection method based on machine vision
CN112528868A (en) Illegal line pressing judgment method based on improved Canny edge detection algorithm
CN116740054A (en) Tongue image tooth trace detection method based on image processing
CN115060754A (en) Surface quality detection method for stainless steel product
CN116630321B (en) Intelligent bridge health monitoring system based on artificial intelligence
CN112381844A (en) Self-adaptive ORB feature extraction method based on image blocking
CN109447952B (en) Semi-reference image quality evaluation method based on Gabor differential box weighting dimension
CN107480648B (en) Method for detecting characters in natural scene
CN104766279B (en) ScanSAR sea ice image incident angle effect is by class bearing calibration
CN114241303A (en) Drainage basin underlying surface feature extraction method based on computer vision technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant