CN106326901B - Water stain image recognition method and TEDS system based on edge point self-similarity - Google Patents

Water stain image recognition method and TEDS system based on edge point self-similarity

Info

Publication number
CN106326901B
Authority
CN
China
Prior art keywords
edge point
self-similarity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610752438.4A
Other languages
Chinese (zh)
Other versions
CN106326901A (en)
Inventor
汪辉
任昌
杨仁兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Xinhe Electronic Technology Co Ltd
Original Assignee
Nanjing Xinhe Electronic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Xinhe Electronic Technology Co Ltd filed Critical Nanjing Xinhe Electronic Technology Co Ltd
Priority to CN201610752438.4A
Publication of CN106326901A
Application granted
Publication of CN106326901B
Active legal status: Current
Anticipated expiration legal status

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/50 - Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The present invention proposes a water stain image recognition method based on edge point self-similarity. The edge points of the image to be inspected are obtained; a reference direction is assigned to each edge point, its feature vector is computed, and the feature vectors are normalized. For every edge point on each edge line, a local and a global self-similarity value are computed, and their weighted combination is taken as the point's final self-similarity value. The set of edge points with high self-similarity on each edge line is obtained; within this set, the edge points with low self-similarity are screened out in a predetermined way and marked as broken edge points. The direction-distribution histogram of the broken edge points is computed and the principal direction of each broken edge point is determined; a downward direction threshold range is preset, broken edge points whose principal direction lies outside the range are rejected, and broken edge points whose principal direction lies inside the range are labeled as the water stain image. The present invention also proposes a TEDS system that applies this water stain image recognition, which effectively identifies water stains on an EMU and reduces the false alarm rate.

Description

Water stain image recognition method and TEDS system based on edge point self-similarity
Technical field
The present invention relates to the field of computer image detection and recognition, and in particular to a water stain image recognition method based on edge point self-similarity and to an EMU operation fault dynamic image detection system (TEDS) that applies the method.
Background
In the field of image recognition, a water stain image is usually just an interfering image whose influence on the recognition of specified features is small; as a result, no method specifically aimed at recognizing water stains exists at present.
EMU fault detection has shifted from on-site manual inspection to the analysis of vehicle images. The EMU operation fault dynamic image detection system (TEDS) monitors the running state of the EMU in real time: imaging equipment installed at the track side and at the rail bottom acquires real-time images of the positions to be inspected on the passing EMU. Automatic fault identification in TEDS mainly uses the difference comparison method. In one variant, the acquired real-time image is compared feature by feature with a standard image, stored in the image library, of the fault-free EMU taken while it is not in operation; in the other variant, the real-time image is compared with a recent fault-free history image of the EMU stored in the image library. In both variants, the locations with obvious feature differences are marked.
In the standard-image comparison method, the precision of the reference image allows differences in the current image to be judged accurately; however, because it is affected by factors such as maintenance and overhaul, natural aging, and water stains on the vehicle, normal changes of the car body are easily mistaken for faults, which raises the false alarm rate. The history-image comparison method can effectively reduce the misjudgments caused by normal car-body changes; however, because the history images are collected in the field along the railway line and are affected by the complex trackside environment, the reference images are not accurate enough, and obvious fault misjudgments still occur.
If the influence of water stains on the EMU body could be overcome in the standard-image comparison method, fault misjudgments could be reduced and the accuracy of fault detection improved.
Summary of the invention
The present invention proposes a water stain image recognition method based on edge point self-similarity, which can recognize water stain images on an EMU and thereby solves the problem of the high fault misjudgment rate of the prior-art standard-image comparison method.
The technical solution of the present invention is realized as follows:
A water stain image recognition method based on edge point self-similarity, comprising the following steps:
Step 1: input the image to be inspected into a computer and obtain all edge points of the image with the Canny edge detection algorithm;
Step 2: classify all edge points, where edge points of the same class belong to one initial edge line of an image contour, and thereby obtain all initial edge lines of the image to be inspected; assign a reference direction to each edge point, extract the feature vector of each edge point, and normalize each feature vector;
Step 3: from the normalized feature vectors, compute the local self-similarity value and the global self-similarity value of every edge point on every initial edge line, and take the weighted combination of the local and global self-similarity values as the point's final self-similarity value;
Step 4: set a high threshold, obtain the set of edge points on every initial edge line whose self-similarity value is above the high threshold, and reject the edge points on every initial edge line whose self-similarity value is below the high threshold;
Step 5: for each edge point in the above set, compute its self-similarity value with its nearest neighboring edge point; set a low threshold, keep the edge points in the set whose value is above the low threshold, and mark the edge points in the set below the low threshold as broken edge points that form an irregular image;
Step 6: classify the edge points above the low threshold in Step 5, where edge points of the same class form one corrected edge line of an image contour; set a length threshold for all corrected edge lines, find the corrected edge lines shorter than the length threshold, and mark the points on those corrected edge lines as broken edge points that form an irregular image;
Step 7: compute the direction-distribution histogram of the broken edge points from Steps 5 and 6 and determine the principal direction of each broken edge point; preset a downward direction threshold range, reject all broken edge points whose principal direction lies outside the direction threshold range, and label the broken edge points whose principal direction lies inside the direction threshold range as the water stain image.
Preferably, in the water stain image recognition method based on edge point self-similarity, the final self-similarity value of each edge point on every initial edge line is computed in Step 3 from the normalized feature vectors as follows. Let Pn and Pm be any two edge points on an edge line, with normalized feature vectors V_Pn and V_Pm respectively. Then:
The similarity of any two edge points Pn and Pm on the edge line is Sim(Pn, Pm) = V_Pn · V_Pm, where the inner product is computed by multiplying corresponding vector elements and summing, giving the similarity of the two edge points Pn and Pm.
The local self-similarity value of edge point Pn: take the four edge points Pn-1, Pn-2, Pn+1, Pn+2 that are adjacent to Pn on its edge line and located on its two sides; then the local self-similarity value of Pn is
Self_Sim(Pn)_local = [Sim(Pn, Pn-2) + Sim(Pn, Pn-1) + Sim(Pn, Pn+1) + Sim(Pn, Pn+2)] / 4.
The global self-similarity value of edge point Pn: assuming the edge line contains n edge points in total, the global self-similarity value of Pn is
Self_Sim(Pn)_global = (1 / (n - 1)) · Σ_{m ≠ n} Sim(Pn, Pm).
Set the weights of the local and global self-similarity to Wloc and Wglo, with Wloc + Wglo = 1 and Wloc, Wglo ∈ [0, 1]; combining the local and global self-similarity values, the final self-similarity value of edge point Pn is
Self_Sim(Pn) = Wloc·Self_Sim(Pn)_local + Wglo·Self_Sim(Pn)_global.
The components of the normalized feature vectors lie between 0 and 1, so the self-similarity value also lies between 0 and 1 and expresses the degree of similarity: a value of 0 is the completely dissimilar state and a value of 1 is the completely similar state.
Preferably, in the water stain image recognition method based on edge point self-similarity, a reference direction is assigned to each edge point in Step 2 as follows:
For any edge point, construct a local neighborhood centered on the current edge point and compute the gradient value and gradient direction of every pixel in the neighborhood; accumulate the gradient values and directions of all pixels in the neighborhood into a histogram containing 9 columns that divide the direction range 0-180 degrees equally, with directions in the 180-360 degree range folded evenly onto the same 9 columns.
Compute the weighting coefficients of each pixel for its two adjacent direction columns, and from the weighting coefficients and the gradient value compute the contribution weight to each of the two adjacent columns; add the contribution weights to the corresponding columns of the histogram of that edge point. The direction of the histogram peak is the reference direction of the edge point.
Preferably, in the water stain image recognition method based on edge point self-similarity, the feature vector of any edge point is extracted in Step 2 as follows:
Let an edge point be P0 with reference direction θ, and rotate the coordinate axes to the reference direction. In the rotated coordinate system, take four points P1, P2, P3, P4 at a preset pixel distance from P0 along the four axis orientations, and construct five local neighborhoods centered on P0, P1, P2, P3, P4; compute the gradient value of every pixel and each pixel's contribution weights to its two adjacent directions. Compute the direction-distribution histograms of the five local neighborhoods, obtaining five histograms hist(1), hist(2), hist(3), hist(4), hist(5). The feature vector of the edge point is FeatureVector = [hist(1) hist(2) hist(3) hist(4) hist(5)]. Finally, the feature vector of each edge point is normalized.
Preferably, in the water stain image recognition method based on edge point self-similarity, the principal direction of each broken edge point is determined as follows:
For any broken edge point Pt with reference direction θ, rotate the coordinate axes to the reference direction and construct a local neighborhood centered on the current edge point Pt; compute the direction of every pixel in the neighborhood. Construct a histogram that divides the direction range 0-180 degrees into 18 columns, with directions in the 180-360 degree range folded evenly onto the same 18 columns, and count the number of pixels whose direction falls within each column's range. The direction of the histogram peak is the principal direction of the current edge point.
The beneficial effects of the present invention are as follows. Every initial edge line is considered first: the self-similarity of each edge point on the line is computed from the feature vectors, the set of edge points with high self-similarity is obtained, and the edge points with low self-similarity are rejected. Within the retained sets, the self-similarity of each edge point is then recomputed in a predetermined way, and the edge points whose similarity is above the low threshold are kept, which increases the continuity of the edge points. The points above the low threshold are reclassified to obtain corrected edge lines; the points on corrected edge lines shorter than the length threshold, together with the points below the low threshold, are marked as broken edge points. Finally, using the characteristic that the trace lines of a water stain run roughly downward, the water stain image is identified from the irregular image. Applying the method in the EMU operation fault dynamic image detection system effectively identifies water stain images on the vehicle, reduces the EMU fault false alarm rate, and improves the detection accuracy of the system.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; a person of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is an acquired image of a certain position of an existing EMU;
Fig. 2 shows the water stain image identified in Fig. 1.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only some of the embodiments of the present invention rather than all of them. All other embodiments obtained by a person of ordinary skill in the art from the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Embodiment: a water stain image recognition method based on edge point self-similarity, applied in an EMU operation fault dynamic image detection system (TEDS system), comprising the following steps.
Step 1: input the EMU image to be inspected into a computer, as shown in Fig. 1, and obtain all edge points of the image with the Canny edge detection algorithm. The detailed procedure is as follows (a code sketch is given after the list):
1. Convert the EMU image to a grayscale image on the computer.
2. Apply Gaussian blur to the grayscale image to reduce the interference of image noise.
3. Compute the gradient value and gradient direction of every pixel in the denoised image.
4. Apply non-maximum suppression to the gradient values to obtain a preliminary set of edge points.
5. Connect edges with the dual-threshold method, reject false edges, and fill edge gaps to obtain a more accurate set of edge points.
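A minimal sketch of Step 1 using OpenCV, assuming the image is read from a file; the blur kernel and the two Canny thresholds below are illustrative values, since the patent does not fix them.

```python
import cv2
import numpy as np

def detect_edge_points(image_path, low_thresh=50, high_thresh=150):
    """Steps 1.1-1.5: grayscale, Gaussian blur, Canny (gradient + NMS + dual-threshold linking)."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)   # 1. convert to a grayscale image
    blurred = cv2.GaussianBlur(gray, (5, 5), 1.4)         # 2. Gaussian blur to suppress noise
    edges = cv2.Canny(blurred, low_thresh, high_thresh)   # 3-5. gradient, non-maximum suppression, dual thresholds
    return np.column_stack(np.nonzero(edges))             # (row, col) coordinates of all edge points
```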
Step 2: classify all edge points, where edge points of the same class belong to one initial edge line of an image contour, and obtain all initial edge lines of the image to be inspected; assign a reference direction to each edge point, extract the feature vector of each edge point, and normalize the feature vector of each edge point.
A reference direction is assigned to each edge point as follows (a code sketch is given after the two items below):
1. For any edge point, construct an 8x8 neighborhood centered on the current edge point and compute the gradient value and gradient direction of every pixel in the neighborhood. Accumulate the gradient values and directions of all pixels in the neighborhood into a histogram containing 9 columns that divide the direction range 0-180 degrees equally (20 degrees per column); directions in the 180-360 degree range are folded evenly onto the same 9 columns.
2. Compute the weighting coefficients of each pixel for its two adjacent direction columns, and from the weighting coefficients and the gradient value compute the contribution weight to each of the two adjacent columns; add the contribution weights to the corresponding columns of the histogram of that edge point. The direction of the histogram peak is the reference direction of the edge point.
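A sketch of this computation under the 9-column scheme; the linear split of each pixel's gradient magnitude between its two adjacent columns is one reasonable reading of the weighting-coefficient step, and the helper names, the precomputed gradient arrays, and the column-center convention are assumptions of this sketch.

```python
import numpy as np

def orientation_histogram(grad_mag, grad_dir_deg, y, x, half=4, n_bins=9):
    """9-column gradient-direction histogram (0-180 deg, 20 deg per column) over an 8x8 neighborhood."""
    hist = np.zeros(n_bins)
    bin_width = 180.0 / n_bins
    mags = grad_mag[y - half:y + half, x - half:x + half].ravel()
    dirs = grad_dir_deg[y - half:y + half, x - half:x + half].ravel() % 180.0  # fold 180-360 onto 0-180
    for m, d in zip(mags, dirs):
        pos = d / bin_width - 0.5                 # position between the two neighboring column centers
        b0 = int(np.floor(pos)) % n_bins
        w = pos - np.floor(pos)                   # weighting coefficient toward the upper column
        hist[b0] += m * (1.0 - w)                 # contribution weights of the pixel, split between
        hist[(b0 + 1) % n_bins] += m * w          # the two adjacent histogram columns
    return hist

def reference_direction(grad_mag, grad_dir_deg, y, x):
    """The reference direction of an edge point is the peak column of its orientation histogram."""
    hist = orientation_histogram(grad_mag, grad_dir_deg, y, x)
    return (np.argmax(hist) + 0.5) * (180.0 / len(hist))   # column-center angle in degrees
```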
The feature vector of each edge point is extracted as follows (a code sketch is given after the two items below):
1. Let an edge point be P0 with reference direction θ, and rotate the coordinate axes to the reference direction, so that the coordinates are transformed as x' = x·cosθ + y·sinθ, y' = -x·sinθ + y·cosθ. In the rotated coordinate system, take four points P1, P2, P3, P4 at a preset pixel distance from P0 along the four axis orientations, and construct five 8x8 neighborhoods centered on P0, P1, P2, P3, P4. Compute the gradient value of every pixel, m(x', y') = sqrt(dx'·dx' + dy'·dy'), and each pixel's contribution weights to its two adjacent directions, w1 = m(x', y')·do and w2 = m(x', y')·(1 - do), where do is the pixel's weighting coefficient with respect to the two adjacent direction columns.
2. Compute the direction-distribution histograms of the five 8x8 neighborhoods, obtaining five histograms hist(1), hist(2), hist(3), hist(4), hist(5). The feature vector of the edge point is FeatureVector = [hist(1) hist(2) hist(3) hist(4) hist(5)]. Finally, the feature vector of each edge point is normalized.
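A sketch of the descriptor for a single edge point; the 8-pixel offset to P1-P4 and the plain magnitude-weighted histogram (without the per-pixel two-column split) are simplifying assumptions, as are the helper names and the precomputed gradient arrays.

```python
import numpy as np

def neighborhood_histogram(grad_mag, grad_dir_deg, y, x, half=4, n_bins=9):
    """Magnitude-weighted direction histogram of one 8x8 neighborhood."""
    m = grad_mag[y - half:y + half, x - half:x + half].ravel()
    d = grad_dir_deg[y - half:y + half, x - half:x + half].ravel() % 180.0
    hist, _ = np.histogram(d, bins=n_bins, range=(0.0, 180.0), weights=m)
    return hist

def edge_point_descriptor(grad_mag, grad_dir_deg, y, x, theta_deg, offset=8):
    """FeatureVector = [hist(1) ... hist(5)] from P0 and four points along the rotated axes, normalized."""
    t = np.deg2rad(theta_deg)
    axes = [(np.cos(t), np.sin(t)), (-np.sin(t), np.cos(t)),      # the four orientations of the
            (-np.cos(t), -np.sin(t)), (np.sin(t), -np.cos(t))]    # coordinate frame rotated by theta
    centers = [(y, x)] + [(int(round(y + offset * dy)), int(round(x + offset * dx)))
                          for dx, dy in axes]
    vec = np.concatenate([neighborhood_histogram(grad_mag, grad_dir_deg, cy, cx)
                          for cy, cx in centers])
    return vec / (np.linalg.norm(vec) + 1e-12)                    # normalized feature vector
```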
Step 3: from the normalized feature vectors, compute the local self-similarity value and the global self-similarity value of every edge point on every initial edge line, and take the weighted combination of the local and global self-similarity values as the point's final self-similarity value.
The final self-similarity value of each edge point on every initial edge line is computed as follows. Let Pn and Pm be any two edge points on an edge line, with normalized feature vectors V_Pn and V_Pm respectively. Then:
The similarity of any two edge points Pn and Pm on the edge line is Sim(Pn, Pm) = V_Pn · V_Pm, where the inner product is computed by multiplying corresponding vector elements and summing, giving the similarity of the two edge points Pn and Pm.
The local self-similarity value of edge point Pn: take the four edge points Pn-1, Pn-2, Pn+1, Pn+2 that are adjacent to Pn on its edge line and located on its two sides; then the local self-similarity value of Pn is
Self_Sim(Pn)_local = [Sim(Pn, Pn-2) + Sim(Pn, Pn-1) + Sim(Pn, Pn+1) + Sim(Pn, Pn+2)] / 4.
The global self-similarity value of edge point Pn: assuming the edge line contains n edge points in total, the global self-similarity value of Pn is
Self_Sim(Pn)_global = (1 / (n - 1)) · Σ_{m ≠ n} Sim(Pn, Pm).
The weights of the local and global self-similarity are Wloc and Wglo, with Wloc, Wglo ∈ [0, 1]; here Wloc = 0.8 and Wglo = 0.2 are chosen. Combining the local and global self-similarity values, the final self-similarity value of edge point Pn is
Self_Sim(Pn) = Wloc·Self_Sim(Pn)_local + Wglo·Self_Sim(Pn)_global.
The components of the normalized feature vectors lie between 0 and 1, so the self-similarity value also lies between 0 and 1 and expresses the degree of similarity: a value of 0 is the completely dissimilar state and a value of 1 is the completely similar state.
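A sketch of Step 3 for one edge line, where `descriptors` is the ordered list of normalized feature vectors along the line; the averaging in the local and global terms follows the formulas as written above and is an interpretation rather than the patent's literal code.

```python
import numpy as np

def final_self_similarity(descriptors, w_loc=0.8, w_glo=0.2):
    """Weighted local/global self-similarity for every edge point on one edge line (values in [0, 1])."""
    V = np.asarray(descriptors)          # shape (n, d); rows are normalized feature vectors
    sim = V @ V.T                        # pairwise inner products Sim(Pn, Pm)
    n = len(V)
    out = np.zeros(n)
    for i in range(n):
        neighbors = [j for j in (i - 2, i - 1, i + 1, i + 2) if 0 <= j < n]
        local = sim[i, neighbors].mean() if neighbors else 1.0   # average over the four nearest edge points
        glob = (sim[i].sum() - sim[i, i]) / max(n - 1, 1)        # average over all other points on the line
        out[i] = w_loc * local + w_glo * glob                    # Self_Sim(Pn) = Wloc*local + Wglo*global
    return out
```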
Step 4: set the high threshold to 0.7, obtain the set of edge points on every initial edge line whose self-similarity value is above 0.7, and reject the edge points on every initial edge line whose self-similarity value is below the high threshold.
Step 5: for each edge point in the set from Step 4, compute its self-similarity value with its nearest neighboring edge point; set the low threshold to 0.2, keep the edge points in the set whose value is above 0.2, and mark the edge points in the set whose value is below 0.2 as broken edge points that form an irregular image.
Step 6: classify the edge points above the low threshold in Step 5, where edge points of the same class form one corrected edge line of an image contour; set the length threshold of all corrected edge lines to 10, find the corrected edge lines shorter than 10, and mark the points on those corrected edge lines as broken edge points that form an irregular image.
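An illustrative sketch of Steps 4 to 6; the greedy 8-connected regrouping stands in for the patent's unspecified classification into corrected edge lines, and the thresholds are the example values used above.

```python
import numpy as np

def group_into_lines(points, gap=1.5):
    """Greedy 8-connected grouping of (row, col) points into edge lines (simplified stand-in
    for the classification into corrected edge lines)."""
    unvisited = {tuple(p) for p in points}
    lines = []
    while unvisited:
        frontier = [unvisited.pop()]
        line = list(frontier)
        while frontier:
            cy, cx = frontier.pop()
            near = [q for q in unvisited if abs(q[0] - cy) <= gap and abs(q[1] - cx) <= gap]
            for q in near:
                unvisited.remove(q)
                line.append(q)
                frontier.append(q)
        lines.append(line)
    return lines

def filter_edge_points(points, descriptors, self_sim, high=0.7, low=0.2, min_len=10):
    """Steps 4-6: split edge points into regular points and broken edge points (the irregular image)."""
    keep = np.asarray(self_sim) > high                        # Step 4: reject points below the high threshold
    pts = np.asarray(points)[keep]
    vecs = np.asarray(descriptors)[keep]
    broken, survivors = [], []
    for i in range(len(pts)):                                 # Step 5: similarity to the nearest neighbor
        dist = np.linalg.norm(pts - pts[i], axis=1)
        j = int(np.argsort(dist)[1])                          # index of the closest other point
        if float(vecs[i] @ vecs[j]) > low:
            survivors.append(i)
        else:
            broken.append(tuple(pts[i]))                      # broken edge point
    regular = []
    for line in group_into_lines(pts[survivors]):             # Step 6: regroup into corrected edge lines
        if len(line) < min_len:
            broken.extend(line)                               # points on short corrected lines are also broken
        else:
            regular.extend(line)
    return regular, broken
```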
Step 7: compute the direction-distribution histogram of the broken edge points from Steps 5 and 6 and determine the principal direction of each broken edge point; preset the downward direction threshold range to 70-110 degrees, reject all broken edge points whose principal direction is not within 70-110 degrees, and label the broken edge points whose principal direction is within 70-110 degrees as the water stain image. The broken edge points whose principal direction falls within this range are shown in Fig. 2.
The principal direction of each broken edge point is determined as follows: for any broken edge point Pt with reference direction θ, rotate the coordinate axes to the reference direction and construct an 8x8 neighborhood centered on the current edge point Pt; compute the direction of every pixel in the neighborhood. Construct a histogram that divides the direction range 0-180 degrees into 18 columns of 10 degrees each, with directions in the 180-360 degree range folded evenly onto the same 18 columns, and count the number of pixels whose direction falls within each column's range. The direction of the histogram peak is the principal direction of the current edge point.
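A sketch of the principal-direction test in Step 7; expressing pixel directions relative to the reference direction θ is one reading of rotating the coordinate axes, and the helper names and precomputed direction array are assumptions.

```python
import numpy as np

def principal_direction(grad_dir_deg, y, x, theta_deg, half=4, n_bins=18):
    """Peak of an 18-column direction histogram (10 degrees per column) over the 8x8 neighborhood of Pt."""
    patch = grad_dir_deg[y - half:y + half, x - half:x + half]
    relative = (patch - theta_deg) % 180.0                 # directions relative to the reference direction,
                                                           # with 180-360 degrees folded onto 0-180 degrees
    hist, _ = np.histogram(relative, bins=n_bins, range=(0.0, 180.0))
    return (np.argmax(hist) + 0.5) * (180.0 / n_bins)      # column-center angle of the histogram peak

def is_water_stain_point(direction_deg, lo=70.0, hi=110.0):
    """Broken edge points whose principal direction is roughly downward are labeled as water stain."""
    return lo <= direction_deg <= hi
```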
Step 8: extract the reference image of the EMU to be inspected from the image library and, using the standard-image method, compare the image to be inspected with the reference image in the EMU operation fault dynamic image detection system. The water stain image identified in Step 7 is treated by default as an external disturbance factor rather than a fault state of the EMU and is not marked during EMU fault detection, which reduces fault misjudgments and improves the accuracy of fault detection.
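A sketch of how the recognition result could be used in Step 8; the fault candidate boxes from the standard-image comparison and the margin parameter are hypothetical, since the patent only states that the identified water stain regions are not marked as faults.

```python
def suppress_water_stain_alarms(fault_boxes, stain_points, margin=5):
    """Drop fault candidates (y0, x0, y1, x1) that contain detected water-stain edge points."""
    kept = []
    for (y0, x0, y1, x1) in fault_boxes:
        has_stain = any(y0 - margin <= y <= y1 + margin and x0 - margin <= x <= x1 + margin
                        for (y, x) in stain_points)
        if not has_stain:                  # no water-stain evidence inside this difference region
            kept.append((y0, x0, y1, x1))
    return kept
```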
The neighborhoods mentioned above are chosen according to the specific situation and may also be chosen as other sizes, such as 8x16. The high threshold, low threshold, length threshold, and direction threshold range mentioned above may be chosen according to the type of image actually being inspected.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement, improvement, and the like made within the spirit and principle of the present invention shall be included in the protection scope of the present invention.

Claims (6)

1. A water stain image recognition method based on edge point self-similarity, characterized by comprising the following steps:
Step 1: input the image to be inspected into a computer and obtain all edge points of the image with the Canny edge detection algorithm;
Step 2: classify all edge points, where edge points of the same class belong to one initial edge line of an image contour, and thereby obtain all initial edge lines of the image to be inspected; assign a reference direction to each edge point, extract the feature vector of each edge point, and normalize each feature vector;
Step 3: from the normalized feature vectors, compute the local self-similarity value and the global self-similarity value of every edge point on every initial edge line, and take the weighted combination of the local and global self-similarity values as the point's final self-similarity value;
Step 4: set a high threshold, obtain the set of edge points on every initial edge line whose self-similarity value is above the high threshold, and reject the edge points on every initial edge line whose self-similarity value is below the high threshold;
Step 5: for each edge point in the above set, compute its self-similarity value with its nearest neighboring edge point; set a low threshold, keep the edge points in the set whose value is above the low threshold, and mark the edge points in the set below the low threshold as broken edge points that form an irregular image;
Step 6: classify the edge points above the low threshold in Step 5, where edge points of the same class form one corrected edge line of an image contour; set a length threshold for all corrected edge lines, find the corrected edge lines shorter than the length threshold, and mark the points on those corrected edge lines as broken edge points that form an irregular image;
Step 7: compute the direction-distribution histogram of the broken edge points from Steps 5 and 6 and determine the principal direction of each broken edge point; preset a downward direction threshold range, reject all broken edge points whose principal direction lies outside the direction threshold range, and label the broken edge points whose principal direction lies inside the direction threshold range as the water stain image.
2. The water stain image recognition method based on edge point self-similarity according to claim 1, characterized in that the final self-similarity value of each edge point on every initial edge line is computed in Step 3 from the normalized feature vectors as follows: let Pn and Pm be any two edge points on an edge line, with normalized feature vectors V_Pn and V_Pm respectively; then
the similarity of any two edge points Pn and Pm on the edge line is Sim(Pn, Pm) = V_Pn · V_Pm, where the inner product is computed by multiplying corresponding vector elements and summing, giving the similarity of the two edge points Pn and Pm;
the local self-similarity value of edge point Pn: take the four edge points Pn-1, Pn-2, Pn+1, Pn+2 that are adjacent to Pn on its edge line and located on its two sides; then the local self-similarity value of Pn is
Self_Sim(Pn)_local = [Sim(Pn, Pn-2) + Sim(Pn, Pn-1) + Sim(Pn, Pn+1) + Sim(Pn, Pn+2)] / 4;
the global self-similarity value of edge point Pn: assuming the edge line contains n edge points in total, the global self-similarity value of Pn is
Self_Sim(Pn)_global = (1 / (n - 1)) · Σ_{m ≠ n} Sim(Pn, Pm);
set the weights of the local and global self-similarity to Wloc and Wglo, with Wloc + Wglo = 1 and Wloc, Wglo ∈ [0, 1]; combining the local and global self-similarity values, the final self-similarity value of edge point Pn is
Self_Sim(Pn) = Wloc·Self_Sim(Pn)_local + Wglo·Self_Sim(Pn)_global;
the components of the normalized feature vectors lie between 0 and 1, so the self-similarity value lies between 0 and 1 and expresses the degree of similarity: a value of 0 is the completely dissimilar state and a value of 1 is the completely similar state.
3. The water stain image recognition method based on edge point self-similarity according to claim 1, characterized in that a reference direction is assigned to each edge point in Step 2 as follows:
for any edge point, construct a local neighborhood centered on the current edge point and compute the gradient value and gradient direction of every pixel in the neighborhood; accumulate the gradient values and directions of all pixels in the neighborhood into a histogram containing 9 columns that divide the direction range 0-180 degrees equally, with directions in the 180-360 degree range folded evenly onto the same 9 columns;
compute the weighting coefficients of each pixel for its two adjacent direction columns, and from the weighting coefficients and the gradient value compute the contribution weight to each of the two adjacent columns; add the contribution weights to the corresponding columns of the histogram of that edge point; the direction of the histogram peak is the reference direction of the edge point.
4. The water stain image recognition method based on edge point self-similarity according to claim 1, characterized in that the feature vector of any edge point is extracted in Step 2 as follows:
let an edge point be P0 with reference direction θ, and rotate the coordinate axes to the reference direction; in the rotated coordinate system, take four points P1, P2, P3, P4 at a preset pixel distance from P0 along the four axis orientations, and construct five local neighborhoods centered on P0, P1, P2, P3, P4; compute the gradient value of every pixel and each pixel's contribution weights to its two adjacent directions; compute the direction-distribution histograms of the five local neighborhoods, obtaining five histograms hist(1), hist(2), hist(3), hist(4), hist(5); the feature vector of the edge point is FeatureVector = [hist(1) hist(2) hist(3) hist(4) hist(5)]; finally, the feature vector of each edge point is normalized.
5. The water stain image recognition method based on edge point self-similarity according to claim 1, characterized in that the principal direction of each broken edge point is determined as follows:
for any broken edge point Pt with reference direction θ, rotate the coordinate axes to the reference direction and construct a local neighborhood centered on the current edge point Pt; compute the direction of every pixel in the neighborhood; construct a histogram that divides the direction range 0-180 degrees into 18 columns, with directions in the 180-360 degree range folded evenly onto the same 18 columns, and count the number of pixels whose direction falls within each column's range; the direction of the histogram peak is the principal direction of the current edge point.
6. An EMU operation fault dynamic image detection system, characterized by including the water stain image recognition method based on edge point self-similarity according to any one of claims 1 to 5, wherein the identified water stain is treated by default as a normal state of the EMU and is not marked in the EMU fault detection.
CN201610752438.4A 2016-08-30 2016-08-30 Water stain image recognition method and TEDS system based on edge point self-similarity Active CN106326901B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610752438.4A CN106326901B (en) 2016-08-30 2016-08-30 Water stain image recognition method and TEDS system based on edge point self-similarity

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610752438.4A CN106326901B (en) 2016-08-30 2016-08-30 Water stain image recognition method and TEDS system based on edge point self-similarity

Publications (2)

Publication Number Publication Date
CN106326901A CN106326901A (en) 2017-01-11
CN106326901B 2019-06-14

Family

ID=57789936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610752438.4A Active CN106326901B (en) 2016-08-30 2016-08-30 Water stain image recognition method and TEDS system based on edge point self-similarity

Country Status (1)

Country Link
CN (1) CN106326901B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109063764A (en) * 2018-07-26 2018-12-21 福建和盛高科技产业有限公司 A kind of judgment method of disconnecting switch closing operation in place based on machine vision
US11023770B2 (en) 2019-09-23 2021-06-01 Hong Kong Applied Science And Technology Research Institute Co., Ltd. Systems and methods for obtaining templates for tessellated images
CN110827220B (en) * 2019-11-01 2023-05-30 杭州当虹科技股份有限公司 Anti-aliasing method based on image edge analysis

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1107179A2 (en) * 1999-11-29 2001-06-13 Eastman Kodak Company Method for detecting sky in images
CN101789005A (en) * 2010-01-22 2010-07-28 深圳创维数字技术股份有限公司 Image searching method based on region of interest (ROI)
CN102589808A (en) * 2012-01-16 2012-07-18 苏州临点三维科技有限公司 Large-scale tunnel seepage point measuring method
CN105551035A (en) * 2015-12-09 2016-05-04 深圳市华和瑞智科技有限公司 Stereoscopic vision matching method based on weak edge and texture classification
CN105844278A (en) * 2016-04-15 2016-08-10 浙江理工大学 Multi-feature fused fabric scanning pattern recognition method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1107179A2 (en) * 1999-11-29 2001-06-13 Eastman Kodak Company Method for detecting sky in images
CN101789005A (en) * 2010-01-22 2010-07-28 深圳创维数字技术股份有限公司 Image searching method based on region of interest (ROI)
CN102589808A (en) * 2012-01-16 2012-07-18 苏州临点三维科技有限公司 Large-scale tunnel seepage point measuring method
CN105551035A (en) * 2015-12-09 2016-05-04 深圳市华和瑞智科技有限公司 Stereoscopic vision matching method based on weak edge and texture classification
CN105844278A (en) * 2016-04-15 2016-08-10 浙江理工大学 Multi-feature fused fabric scanning pattern recognition method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhang Zhengbing et al., "Research on using the local self-similarity of images for edge detection" (《图像的局部自相似性用于边缘检测的方法研究》), Journal of Huazhong University of Science and Technology (《华中理工大学学报》), 31 Oct. 1996, Vol. 24, No. 10, pp. 53-55.

Also Published As

Publication number Publication date
CN106326901A (en) 2017-01-11

Similar Documents

Publication Publication Date Title
US11580647B1 (en) Global and local binary pattern image crack segmentation method based on robot vision
CN109087510B (en) Traffic monitoring method and device
CN110348297B (en) Detection method, system, terminal and storage medium for identifying stereo garage
CN106682586A (en) Method for real-time lane line detection based on vision under complex lighting conditions
CN112084869B (en) Compact quadrilateral representation-based building target detection method
CN115018828A (en) Defect detection method for electronic component
CN106897681B (en) Remote sensing image contrast analysis method and system
CN106326901B (en) Water stain image-recognizing method and TEDS system based on marginal point self-similarity
CN106778633B (en) Pedestrian identification method based on region segmentation
CN105205486A (en) Vehicle logo recognition method and device
CN104809433A (en) Zebra stripe detection method based on maximum stable region and random sampling
CN112036385B (en) Library position correction method and device, electronic equipment and readable storage medium
CN102867188A (en) Method for detecting seat state in meeting place based on cascade structure
CN113781482B (en) Method and system for detecting crack defects of mechanical parts in complex environment
Maček et al. A lane detection vision module for driver assistance
CN107273802A (en) A kind of detection method and device of railroad train brake shoe drill ring failure
CN103985106B (en) Apparatus and method for carrying out multiframe fusion to very noisy image
CN114820625A (en) Automobile top block defect detection method
CN116109986A (en) Vehicle track extraction method based on laser radar and video technology complementation
CN107977608B (en) Method for extracting road area of highway video image
Ying et al. An illumination-robust approach for feature-based road detection
CN116596428B (en) Rural logistics intelligent distribution system based on unmanned aerial vehicle
CN109784229B (en) Composite identification method for ground building data fusion
CN106384358B (en) The recognition methods of irregular image based on marginal point self-similarity
CN106327499B (en) The identification of greasy dirt image based on marginal point self-similarity and TEDS systems

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant