CN110763681B - Pantograph abrasion area positioning detection method - Google Patents

Pantograph abrasion area positioning detection method

Info

Publication number
CN110763681B
Authority
CN
China
Legal status
Active
Application number
CN201910783246.3A
Other languages
Chinese (zh)
Other versions
CN110763681A (en)
Inventor
Xing Zongyi (邢宗义)
Zhou Zhihui (周祉慧)
Zhang Yong (张永)
Xu Wen (徐文)
Yang Xing (杨行)
Yang Shuangyan (杨双艳)
Cong Guangtao (从光涛)
Zhang Jiabing (章加兵)
Yang Binhui (杨斌辉)
Current Assignee
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Nanjing University of Science and Technology
Priority to CN201910783246.3A (2019-08-23)
Publication of CN110763681A (2020-02-07)
Application granted
Publication of CN110763681B (2022-06-10)
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 2021/8854 Grading and classifying of flaws
    • G01N 2021/8861 Determining coordinates of flaws
    • G01N 2021/8887 Scan or image signal processing based on image processing techniques
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20024 Filtering details

Abstract

The invention discloses a method for locating and detecting the wear area of a pantograph. The method comprises the following steps: first, image data are acquired on site and preprocessed to eliminate noise interference; the preprocessed image is then grayed, reducing the three-channel color image to a single-channel gray image; next, the contrast of the image is enhanced according to the gray-level differences of the image pixels, strengthening the light-dark relationship and separating the detection target from the background; finally, the feature positions are determined with a hierarchical gray quantization algorithm and the target detection area is located. The method locates and detects the wear area of a pantograph with a low requirement on the amount of image data, accurate detection results, and strong applicability.

Description

Pantograph abrasion area positioning detection method
Technical Field
The invention belongs to the technical field of traffic safety engineering, and particularly relates to a pantograph abrasion area positioning detection method.
Background
In recent years, rail transit in China has developed rapidly, and both its operating mileage and its mileage under construction keep growing. As a result, issues such as the operation control, driving safety, real-time monitoring and fault diagnosis of the operating state, and fast, efficient fault maintenance of rail transit vehicles have become increasingly important.
A train pantograph is the device through which an electric locomotive on an electrified railway collects current from the catenary: the strips of its slide plate are in direct contact with the contact wire and draw current from it for the locomotive. The condition of the pantograph directly affects the safe and reliable operation of the train, and a pantograph fault can interrupt transportation. Because the pantograph is constantly in high-speed motion and is constrained by its structure, various faults easily occur during operation. If slight damage to the pantograph is not discovered in time, the damage keeps expanding and finally causes a pantograph-catenary fault; conversely, slight damage to the catenary may in turn damage the pantograph. Real-time monitoring of the pantograph makes it possible to find problem areas of the catenary promptly and accurately, which facilitates timely maintenance and repair of the catenary. Therefore, accurate real-time monitoring of the pantograph can greatly reduce the probability of pantograph and catenary faults.
Detecting the pantograph-catenary system with digital image acquisition and processing technology is a steadily developing trend. Image-based detection can inspect multiple pantograph-catenary components with a single device and is non-contact, highly automated, flexible in design, low in cost, and efficient. Existing pantograph detection methods first require coarse positioning to find the approximate area of the pantograph and then fine positioning to locate its specific area. Only after these two sets of parameters are matched and positioning has been performed twice can the exact position of the pantograph be found; this is inefficient, and tuning the parameters requires considerable field experience.
Disclosure of Invention
The invention aims to provide a high-precision and high-speed pantograph abrasion area positioning detection method.
The technical solution for realizing the purpose of the invention is as follows: a pantograph abrasion area positioning detection method comprises the following steps:
step 1, acquiring image data on site, and performing Gaussian filtering processing on the image;
step 2, carrying out graying processing on the filtered image;
step 3, according to the gray level difference of the image pixels, contrast enhancement is carried out on the image;
step 4, determining the feature positions by using a hierarchical gray quantization algorithm;
and step 5, positioning a target detection area and determining the pantograph abrasion area.
Further, the graying processing is performed on the filtered image in the step 2, which specifically includes:
step 2.1, the image f(x, y) is composed of pixel points, and the formula is as follows:
f(x, y) = [ f(0, 0)     f(0, 1)     …  f(0, y-1)
            f(1, 0)     f(1, 1)     …  f(1, y-1)
            …
            f(x-1, 0)   f(x-1, 1)   …  f(x-1, y-1) ]
in the formula, the image f(x, y) is the set of all pixels of the picture, i, j denote the row and column coordinates of the logical arrangement position of each pixel point, and i ∈ (0, x-1), j ∈ (0, y-1);
step 2.2, the three-channel brightness maximum value of each pixel point in the color image is taken as a gray value to be processed, and the formula is as follows:
f′(i,j)=max(R(i,j),G(i,j),B(i,j))
in the formula, f′(i, j) is the single-channel gray value of each point, i, j denote the row and column coordinates of the logical arrangement position of the pixel point, and R(i, j), G(i, j), B(i, j) denote the red, green and blue channel values at coordinate position (i, j);
Step 2.3, obtaining a single-channel pixel gray value of the picture, wherein the formula is as follows:
f′(x,y)=[f′(i,j)],i∈(0,x-1),j∈(0,y-1)
further, according to the gray level difference of the pixels of the image, the contrast of the image is enhanced in step 3, which specifically includes the following steps:
step 3.1, the gradient vector of any coordinate point (i, j) of the input image f(x, y) is as follows:
∇f(i, j) = [G_i, G_j]^T = [∂f/∂i, ∂f/∂j]^T
in the formula, ∇f(i, j) is the gradient vector of f(i, j);
step 3.2, the components G_i and G_j of the gradient vector ∇f(i, j) on the rows and columns are as follows:
G_i = ∂f(i, j)/∂i ≈ f(i+1, j) − f(i, j)
G_j = ∂f(i, j)/∂j ≈ f(i, j+1) − f(i, j)
step 3.3, establishing a contrast model l(i, j, m, σ) from the obtained gradient vector ∇f(i, j):
[formula image for l(i, j, m, σ) not reproduced in the source text]
wherein σ is the standard deviation of the picture pixel values and m is the light-dark contrast coefficient;
and 3.4, multiplying by the contrast model and then remapping to obtain an image with enhanced contrast, the formula being as follows:
P(i, j) = (l_{i,j})^α
in the formula, α is a forward enhancement coefficient whose optimal value is determined empirically; l_{i,j} is the enhanced value of point (i, j), obtained by applying the contrast model to the pixel value at that point; P(i, j) is the value at coordinate (i, j) after the enhanced value l_{i,j} is remapped.
Further, the step 4 of determining the feature positions by using a hierarchical gray quantization algorithm specifically includes:
step 4.1, grading according to the gray value: first determine the range interval in which the gray value at image point (i, j) lies, and then replace the gray value with the upper bound of that interval, the formula being as follows:
P′(i, j) = n2,  n1 < P(i, j) ≤ n2
g(i, j) = P′(i, j),  i ∈ (0, x-1), j ∈ (0, y-1)
where g(i, j) is the transformed value and P′(i, j) is the transformed component of P(i, j); n1 is the lower interval bound, n2 is the upper interval bound, and n1 ∈ [0, 255], n2 ∈ [0, 255];
step 4.2, summing the components g(i, j) of each row to obtain F(x), the gray-level sum of each row, the formula being as follows:
F(x) = Σ_{j=0}^{y-1} g(x, j)
wherein F(x) is the row-wise sum of the components g(i, j), i.e., the sum of the gray levels of each row.
Further, the positioning of the target detection area in step 5 is specifically as follows:
carrying out differentiation of the gray-level data matrix, namely differentiation in the x direction, the formula being as follows:
F′(x) = dF(x)/dx
in the formula, F′(x) is the set of derivatives of the gray-level data matrix in the x direction; combining F′(x) with F(x), the feature points are determined and the target detection area is located.
Compared with the prior art, the invention has the following notable advantages: (1) by fitting the variation trend among regions and dividing the regions accordingly, interference is effectively eliminated before multiple regions are combined, the practical interference problem can be solved by adjusting only one set of parameters, and detection efficiency is greatly improved; (2) filtering the image reduces errors caused by image noise, and processing the grayscale picture reduces the influence of color and improves fault tolerance; (3) enhancing the image separates the background from the pantograph more distinctly; (4) the feature points are obtained with a hierarchical gray quantization algorithm, so the target pantograph detection area is located with strong applicability and high accuracy.
Drawings
Fig. 1 is a schematic flow chart of a pantograph worn area positioning detection algorithm according to the present invention.
Fig. 2 is an original view of a field-acquired pantograph left half-bow in an embodiment of the present invention.
Fig. 3 is a schematic diagram of a field-collected pantograph left half pantograph after gaussian filtering according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of a field-acquired pantograph left half pantograph with contrast enhancement according to an embodiment of the present invention.
Fig. 5 is a schematic diagram of a left half pantograph of a field-acquisition pantograph after being processed by a gray scale algorithm in the embodiment of the invention.
FIG. 6 is a graph of the data set determined from the row coordinates in an embodiment of the present invention.
FIG. 7 is a graph of the row-coordinate data and its first-derivative set in an embodiment of the present invention.
Fig. 8 is a schematic diagram of a positioning detection result of a worn area in an embodiment of the present invention.
Detailed Description
With reference to fig. 1, the method for positioning and detecting the wear area of the pantograph of the present invention includes the following steps:
step 1, acquiring image data on site, and performing Gaussian filtering processing on an image, wherein the method specifically comprises the following steps:
the image data acquired by the field data is subjected to Gaussian filtering, and the template coefficient of the Gaussian filter is reduced along with the increase of the distance from the center of the template, so that the Gaussian filter has a small degree of blurring relative to the image, can effectively inhibit noise, and can keep the structure of the image approximately unchanged to smooth the image.
Step 2, performing graying processing on the filtered image, which specifically comprises the following steps:
step 2.1, the image f(x, y) is composed of pixel points, and the formula is as follows:
f(x, y) = [ f(0, 0)     f(0, 1)     …  f(0, y-1)
            f(1, 0)     f(1, 1)     …  f(1, y-1)
            …
            f(x-1, 0)   f(x-1, 1)   …  f(x-1, y-1) ]
in the formula, the image f(x, y) is the set of all pixels of the picture, i, j denote the row and column coordinates of the logical arrangement position of each pixel point, and i ∈ (0, x-1), j ∈ (0, y-1);
after magnification, the picture appears as a lattice of independent dots; each dot is a pixel point, and different colors are displayed through the proportions of the RGB color components;
step 2.2, the three-channel brightness maximum value of each pixel point in the color image is taken as a gray value to be processed, and the formula is as follows:
f′(i,j)=max(R(i,j),G(i,j),B(i,j))
in the formula, f′(i, j) is the single-channel gray value of each point, i, j denote the row and column coordinates of the logical arrangement position of the pixel point, and R(i, j), G(i, j), B(i, j) denote the red, green and blue channel values at coordinate position (i, j);
processing an image means processing its pixel points; a multi-channel image can be separated into several single-channel images, or single-channel images can be combined into a multi-channel image, to facilitate processing. In the field picture data acquired here, the brightness difference between the pantograph and the background is large, so graying with the maximum brightness value achieves a better processing effect;
Step 2.3, obtaining a single-channel pixel gray value of the picture, wherein a formula is as follows:
f′(x,y)=[f′(i,j)],i∈(0,x-1),j∈(0,y-1)
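A minimal sketch of this maximum-value graying, assuming the filtered picture is held as an H×W×3 RGB array (the function name is illustrative):

```python
import numpy as np

def gray_max(rgb):
    """f'(i,j) = max(R(i,j), G(i,j), B(i,j)):
    keep the brightest of the three channels at each pixel."""
    return rgb.max(axis=2).astype(np.uint8)

# usage: gray = gray_max(filtered_rgb_image)
```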
step 3, according to the gray level difference of the image pixel, contrast enhancement is carried out on the image, which specifically comprises the following steps:
step 3.1, the gradient vector of any coordinate point (i, j) of the input image f(x, y) is as follows:
∇f(i, j) = [G_i, G_j]^T = [∂f/∂i, ∂f/∂j]^T
in the formula, ∇f(i, j) is the gradient vector of f(i, j);
step 3.2, the components G_i and G_j of the gradient vector ∇f(i, j) on the rows and columns are as follows:
G_i = ∂f(i, j)/∂i ≈ f(i+1, j) − f(i, j)
G_j = ∂f(i, j)/∂j ≈ f(i, j+1) − f(i, j)
step 3.3, establishing a contrast model l(i, j, m, σ) from the obtained gradient vector ∇f(i, j):
[formula image for l(i, j, m, σ) not reproduced in the source text]
in the formula, σ is the standard deviation of the picture pixel values and m is the light-dark contrast coefficient; within a certain range, the larger the coefficient m, the stronger the ability to improve the picture contrast;
and 3.4, multiplying by the contrast model and then remapping to obtain an image with enhanced contrast, the formula being as follows:
P(i, j) = (l_{i,j})^α
in the formula, α is a forward enhancement coefficient whose optimal value is determined empirically; l_{i,j} is the enhanced value of point (i, j), obtained by applying the contrast model to the pixel value at that point; P(i, j) is the value at coordinate (i, j) after the enhanced value l_{i,j} is remapped;
each pixel of the original pantograph image lies in the 0-255 interval; after the parameterized normalization, the image becomes a standard image of the same form as the original, so its linear properties are preserved and the original standard format is kept.
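Because the formula image for the contrast model l(i, j, m, σ) is not reproduced in the source text, the sketch below assumes an illustrative gradient-weighted model l = f·(1 + m·|∇f|/σ) before the power-law remapping P = l^α and normalization back to the 0-255 range; the model form and the default values of m and α are assumptions:

```python
import numpy as np

def enhance_contrast(gray, m=2.0, alpha=1.5):
    """Hypothetical stand-in for the patent's contrast model: the
    exact formula for l(i, j, m, sigma) is not reproduced, so a
    gradient-weighted form is used purely for illustration."""
    f = gray.astype(float)
    sigma = f.std()                              # std dev of pixel values
    Gi = np.diff(f, axis=0, append=f[-1:, :])    # row-direction difference
    Gj = np.diff(f, axis=1, append=f[:, -1:])    # column-direction difference
    grad = np.hypot(Gi, Gj)                      # |grad f(i, j)|
    l = f * (1.0 + m * grad / (sigma + 1e-9))    # assumed contrast model
    p = l ** alpha                               # forward enhancement l^alpha
    p = 255.0 * (p - p.min()) / (np.ptp(p) + 1e-9)  # remap to 0-255
    return p.astype(np.uint8)
```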
Step 4, determining the feature positions with a hierarchical gray quantization algorithm, specifically as follows:
After filtering and denoising, the interference from pixel gray-level differences is small; the enhanced image has high contrast and distinct gray-level differences, and the hierarchical quantization of the image gray levels exploits exactly this property.
Step 4.1, grading according to the gray value: first determine the range interval in which the gray value at image point (i, j) lies, and then replace the gray value with the upper bound of that interval, the formula being as follows:
P′(i, j) = n2,  n1 < P(i, j) ≤ n2
g(i, j) = P′(i, j),  i ∈ (0, x-1), j ∈ (0, y-1)
where g(i, j) is the transformed value and P′(i, j) is the transformed component of P(i, j); n1 is the lower interval bound, n2 is the upper interval bound, and n1 ∈ [0, 255], n2 ∈ [0, 255];
step 4.2, summing the components g(i, j) of each row to obtain F(x), the gray-level sum of each row, the formula being as follows:
F(x) = Σ_{j=0}^{y-1} g(x, j)
wherein F(x) is the row-wise sum of the components g(i, j), i.e., the sum of the gray levels of each row.
The continuous brightness ranges at the sampling points of the image data are divided into intervals, and each interval is converted to a single specific pixel value; this is the quantization process. The more quantization levels there are, the richer the image gradations and the larger the data volume; the fewer the levels, the poorer the gradations and the image quality, and the smaller the data volume. The quantized image is an integer matrix, so the quantization levels must be chosen with care. When the number of gray levels of the image is limited, the grading intervals can be chosen empirically to obtain an image of good quality with a small data volume.
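A sketch of this hierarchical quantization and the row summation F(x) = Σ_j g(x, j); the interval boundaries are assumed values (the patent chooses them empirically), with a two-level grading that mirrors the all-white/all-black result of the embodiment:

```python
import numpy as np

def quantize_and_row_sum(p, bounds=(0, 128, 255)):
    """Replace each gray value with the upper bound n2 of the
    interval (n1, n2] that contains it, then sum each row:
    F(x) = sum over j of g(x, j)."""
    g = np.zeros_like(p, dtype=np.int64)
    for n1, n2 in zip(bounds[:-1], bounds[1:]):
        mask = (p > n1) & (p <= n2)
        g[mask] = n2                  # P'(i, j) = n2 for n1 < P(i, j) <= n2
    g[p <= bounds[0]] = bounds[0]     # lowest value mapped explicitly
    F = g.sum(axis=1)                 # gray-level sum of each row
    return g, F
```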
Step 5, positioning a target detection area, and determining a pantograph abrasion area, wherein the method specifically comprises the following steps:
carrying out differentiation of the gray-level data matrix, namely differentiation in the x direction, the formula being as follows:
F′(x) = dF(x)/dx
in the formula, F′(x) is the set of derivatives of the gray-level data matrix in the x direction. Combining F′(x) with F(x), the feature points are determined and the target detection area is located. Here F(x) is the row-sum matrix of the image and F′(x) is its first-derivative matrix; the values need to be compared one by one to improve accuracy.
Because each row is summed, the information content of the image is greatly simplified: local features of the image can be described at different scales in a simple form, the storage footprint is small, and processing is relatively fast; conversely, redundant information grows as the number of parameters increases. After feature analysis of the image, the feature points are determined, and the region of the pantograph image to be detected can then be delimited.
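A sketch of the row-sum differentiation and the combined use of F(x) and F′(x) to find candidate feature rows; the peak/valley averaging and the near-zero tolerance are illustrative assumptions, not thresholds given in the patent:

```python
import numpy as np

def locate_region(F):
    """Differentiate F(x) in the x direction and combine F'(x)
    with F(x) to pick out upper and lower feature rows."""
    dF = np.gradient(F.astype(float))            # F'(x), first derivative set
    hi = F[F >= F.mean()].mean()                 # average of the curve peaks
    lo = F[F < F.mean()].mean()                  # average of the curve valleys
    # a near-zero derivative marks a plateau between dark and bright parts
    flat = np.isclose(dF, 0.0, atol=np.abs(dF).max() * 0.05)
    upper_rows = np.where(flat & (F >= hi))[0]   # candidate upper feature rows
    lower_rows = np.where(flat & (F <= lo))[0]   # candidate lower feature rows
    return upper_rows, lower_rows
```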
The invention is described in further detail below with reference to the figures and the embodiments.
Example 1
Using the pantograph abrasion area positioning detection method described above, the acquired field image data were tested on the Matlab platform. A recently inspected train without pantograph faults, running under ATO (automatic train operation) so that its passing speed was stable and the obtained picture quality was good, was selected as the dynamic detection object. The original image of the left half of the pantograph acquired in the field is shown in FIG. 2.
The field image data after Gaussian filtering preprocessing are shown in FIG. 3; the slight noise has clearly been eliminated.
The image was then grayed and contrast-enhanced. As shown in FIG. 4, the resulting image has markedly higher contrast than the original: the highlighted area is enlarged, the dark portions are darker, and the distinction between them is strengthened.
The result of processing with the hierarchical gray quantization algorithm is shown in FIG. 5: the highlights are graded to pure white and the gray-black areas to pure black, so the information of the pantograph is well preserved and clearly distinguished from the background.
The data set obtained from the row coordinates is shown in FIG. 6 and its first derivative in FIG. 7. In FIG. 6 the abscissa is the row number and the ordinate is the gray-level sum of the whole row; because the background and the pantograph are distributed differently, the gray-sum curve shows a clear pattern: the low parts of the curve correspond to the actually dark regions, the high parts to the bright regions, and the peaks and valleys together determine the position of the pantograph well.
The abscissa of FIG. 7 is the row number and the ordinate is the first derivative of the gray-level sum of the whole row, which represents the transitions between the dark and bright portions; the feature positions can be determined well from the distribution curves of FIG. 6 and FIG. 7.
In FIG. 6, the average of the curve peaks is taken as the upper feature level and the average of the curve valleys as the lower feature level; the first derivative in FIG. 7 serves as an auxiliary criterion for pinning down the feature points, with the zero-valued points taken as the upper and lower feature points respectively. With the left and right extents kept within an appropriate range, the region to be detected can be delimited, as shown in FIG. 8.
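Reusing the helper functions sketched in the preceding sections (all of them illustrative stand-ins, with the file path below a placeholder), the whole pipeline of this embodiment could be exercised roughly as follows:

```python
import numpy as np
from PIL import Image  # any image loader will do

rgb = np.asarray(Image.open("pantograph_left.png").convert("RGB"))  # placeholder path
# step 1: Gaussian filtering, channel by channel
blurred = np.stack([gaussian_filter(rgb[..., c]) for c in range(3)], axis=2)
gray = gray_max(blurred)            # step 2: maximum-value graying (FIG. 3)
p = enhance_contrast(gray)          # step 3: contrast enhancement (FIG. 4)
g, F = quantize_and_row_sum(p)      # step 4: quantization and row sums (FIGS. 5-6)
upper, lower = locate_region(F)     # step 5: feature rows from F and F' (FIG. 7)
if upper.size and lower.size:
    print("upper feature rows:", upper)
    print("lower feature rows:", lower)
```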
By fitting the variation trend among regions and dividing the regions accordingly, the invention eliminates interference effectively before multiple regions are combined, solves the practical interference problem by adjusting only one set of parameters, and greatly improves detection efficiency.

Claims (1)

1. A pantograph abrasion area positioning detection method is characterized by comprising the following steps:
step 1, acquiring image data on site, and performing Gaussian filtering processing on the image;
step 2, carrying out graying processing on the filtered image;
step 3, according to the gray level difference of the image pixels, contrast enhancement is carried out on the image;
step 4, determining the feature positions by using a hierarchical gray quantization algorithm;
step 5, positioning a target detection area and determining a pantograph abrasion area;
performing graying processing on the filtered image in the step 2 specifically comprises the following steps:
Step 2.1, the image f(x, y) is composed of pixel points, and the formula is as follows:
f(x, y) = [ f(0, 0)     f(0, 1)     …  f(0, y-1)
            f(1, 0)     f(1, 1)     …  f(1, y-1)
            …
            f(x-1, 0)   f(x-1, 1)   …  f(x-1, y-1) ]
in the formula, the image f(x, y) is the set of all pixels of the image, i, j denote the row and column coordinates of the logical arrangement position of each pixel point, and i ∈ (0, x-1), j ∈ (0, y-1);
step 2.2, the three-channel brightness maximum value of each pixel point in the color image is taken as a gray value to be processed, and the formula is as follows:
f′(i,j)=max(R(i,j),G(i,j),B(i,j))
in the formula, f' (i, j) is a single-channel pixel gray value of each coordinate point, i, j represents a row coordinate and a column coordinate of a logical arrangement position of a pixel point, and R (i, j), G (i, j), B (i, j) respectively represent a red channel value, a green channel value, and a blue channel value of the coordinate point (i, j);
step 2.3, obtaining a single-channel pixel gray value of the image, wherein the formula is as follows:
f′(x,y)=[f′(i,j)],i∈(0,x-1),j∈(0,y-1)
step 3, according to the gray level difference of the image pixels, contrast enhancement is performed on the image, which specifically comprises the following steps:
step 3.1, the gradient vector of any coordinate point (i, j) of the input image f(x, y) is as follows:
∇f(i, j) = [G_i, G_j]^T = [∂f/∂i, ∂f/∂j]^T
in the formula, ∇f(i, j) is the gradient vector of f(i, j);
step 3.2, the components G_i and G_j of the gradient vector ∇f(i, j) on the rows and columns are as follows:
G_i = ∂f(i, j)/∂i ≈ f(i+1, j) − f(i, j)
G_j = ∂f(i, j)/∂j ≈ f(i, j+1) − f(i, j)
step 3.3, establishing a contrast model l(i, j, m, σ) from the obtained gradient vector ∇f(i, j):
[formula image for l(i, j, m, σ) not reproduced in the source text]
wherein σ is the standard deviation of the image pixel values and m is the light-dark contrast coefficient;
and 3.4, multiplying by the contrast model and then remapping to obtain an image with enhanced contrast, the formula being as follows:
P(i, j) = (l_{i,j})^α
in the formula, α is a forward enhancement coefficient whose optimal value is determined empirically; l_{i,j} is the strengthened value of coordinate point (i, j), obtained by applying the contrast model to the pixel gray value at that point; P(i, j) is the value at coordinate point (i, j) after the strengthened value l_{i,j} is remapped;
step 4, determining the feature positions with a hierarchical gray quantization algorithm, specifically as follows:
step 4.1, grading according to the gray value: first determine the range interval in which the gray value at image coordinate point (i, j) lies, and then replace the gray value with the upper bound of that interval, the formula being as follows:
P′(i, j) = n2,  n1 < P(i, j) ≤ n2
g(i, j) = P′(i, j),  i ∈ (0, x-1), j ∈ (0, y-1)
where g(i, j) is the transformed value and P′(i, j) is the transformed component of P(i, j); n1 is the lower interval bound, n2 is the upper interval bound, and n1 ∈ [0, 255], n2 ∈ [0, 255];
step 4.2, summing the components g(i, j) of each row to obtain F(x), the gray-level sum of each row, the formula being as follows:
F(x) = Σ_{j=0}^{y-1} g(x, j)
wherein F(x) is the row-wise sum of the components g(i, j);
the positioning of the target detection area in step 5 is specifically as follows:
carrying out differentiation of the gray-level data matrix, namely differentiation in the x direction, the formula being as follows:
F′(x) = dF(x)/dx
in the formula, F′(x) is the set of derivatives of the gray-level data matrix in the x direction; combining F′(x) with F(x), the feature points are determined and the target detection area is located.
CN201910783246.3A (filed 2019-08-23, priority 2019-08-23) Pantograph abrasion area positioning detection method, granted as CN110763681B (Active)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201910783246.3A | 2019-08-23 | 2019-08-23 | Pantograph abrasion area positioning detection method


Publications (2)

Publication Number | Publication Date
CN110763681A | 2020-02-07
CN110763681B | 2022-06-10

Family

ID=69329434

Family Applications (1)

Application Number | Title | Priority Date | Filing Date | Status
CN201910783246.3A | Pantograph abrasion area positioning detection method | 2019-08-23 | 2019-08-23 | Active

Country Status (1)

Country | Link
CN | CN110763681B

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101784887A (en) * 2007-08-06 2010-07-21 昆士兰铁路有限公司 Pantograph damage and wear monitoring system
CN102507601A (en) * 2011-11-08 2012-06-20 南京大学 Online abrasion detection method and system for pantograph of electric locomotive
CN103632361A (en) * 2012-08-20 2014-03-12 阿里巴巴集团控股有限公司 An image segmentation method and a system
JP2015182699A (en) * 2014-03-26 2015-10-22 株式会社明電舎 Inspection equipment for dissolved loss of pantagraph collector head
CN107588733A (en) * 2017-08-21 2018-01-16 南京理工大学 A kind of abrasion of pantograph pan on-line measuring device and method based on image
CN108694349A (en) * 2017-04-07 2018-10-23 成都唐源电气股份有限公司 A kind of pantograph image extraction method and device based on line-scan digital camera


Also Published As

Publication Number | Publication Date
CN110763681A | 2020-02-07

Similar Documents

Publication Publication Date Title
CN110261436B (en) Rail fault detection method and system based on infrared thermal imaging and computer vision
CN103971128B (en) A kind of traffic sign recognition method towards automatic driving car
CN111402247B (en) Machine vision-based method for detecting defects of suspension clamp on power transmission line
Liang et al. Defect detection of rail surface with deep convolutional neural networks
CN111260629A (en) Pantograph structure abnormity detection algorithm based on image processing
CN102855617B (en) Method and system for processing adaptive images
CN105160691A (en) Color histogram based vehicle body color identification method
WO2020134324A1 (en) Image-processing based algorithm for recognizing train number of urban rail train
CN110175556B (en) Remote sensing image cloud detection method based on Sobel operator
CN113324864A (en) Pantograph carbon slide plate abrasion detection method based on deep learning target detection
CN112001299B (en) Tunnel vehicle finger device and lighting lamp fault identification method
CN108389216A (en) Local auto-adaptive threshold segmentation method towards on-line ferrograph image wear Particles Recognition
CN106127124A (en) The automatic testing method of the abnormal image signal in region, taxi front row
CN111709964A (en) PCBA target edge detection method
CN110763681B (en) Pantograph abrasion area positioning detection method
CN109934172B (en) GPS-free full-operation line fault visual detection and positioning method for high-speed train pantograph
CN112800974A (en) Subway rail obstacle detection system and method based on machine vision
CN111339843A (en) Method and device for detecting crowdedness of motor train unit carriage
CN106778675B (en) A kind of recognition methods of target in video image object and device
CN108734158B (en) Real-time train number identification method and device
CN115601558A (en) Single turnout state detection system and detection method and semi-automatic data labeling method
CN115424128A (en) Fault image detection method and system for lower link of freight car bogie
CN109165659A (en) A kind of vehicle color identification method based on super-pixel segmentation
CN114495086A (en) Method and system for identifying lightning protection monitor of electrified railway traction substation
CN112614097B (en) Method for detecting foreign matter on axle box rotating arm of railway train

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventors after: Xing Zongyi, Zhou Zhihui, Zhang Yong, Xu Wen, Yang Xing, Yang Shuangyan, Cong Guangtao, Zhang Jiabing, Yang Binhui

Inventors before: Xu Wen, Zhou Zhihui, Zhang Yong, Xing Zongyi, Yang Xing, Yang Shuangyan, Cong Guangtao, Zhang Jiabing, Yang Binhui

GR01 Patent grant