CN110261436B - Rail fault detection method and system based on infrared thermal imaging and computer vision - Google Patents


Info

Publication number
CN110261436B
Authority
CN
China
Prior art keywords
image
track
area
value
infrared thermal
Prior art date
Legal status
Active
Application number
CN201910509281.6A
Other languages
Chinese (zh)
Other versions
CN110261436A
Inventor
李伟华
张敏
佘佳俊
杨皓然
梁祖懿
雷英佳
张泽恒
谭铭濠
Current Assignee
Jinan University
Original Assignee
Jinan University
Priority date
Filing date
Publication date
Application filed by Jinan University
Priority to CN201910509281.6A
Publication of CN110261436A
Application granted
Publication of CN110261436B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 25/00 Investigating or analyzing materials by the use of thermal means
    • G01N 25/72 Investigating presence of flaws
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 7/13 Edge detection
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10048 Infrared image
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G06T 2207/20084 Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a track fault detection method and system based on infrared thermal imaging and computer vision, wherein the method comprises the following steps: an unmanned aerial vehicle acquires images of the tramcar track; the ground station receives the image data of the high-definition camera and carries out image preprocessing; multi-threshold track area segmentation is performed twice, once on the darker area in the groove and once on the brighter area outside the groove, the track area is segmented according to the adjacent-distance characteristics of the brighter and darker areas, and a track image is extracted; the infrared thermograph is grayed and a high-temperature area on the track is extracted by a relative temperature difference method; the preprocessed image is superposed with the track detection window and masked to obtain a region of interest, the region of interest is subjected to edge-closure judgment and filling to obtain connected regions, and the connected regions are screened to obtain suspected track foreign matter; the suspected track foreign matter is input into a BP neural network for identification to obtain a foreign matter classification result. The invention can identify foreign matter on the rail and detect its temperature in real time, reduce the accident rate of rail transit and improve the running safety of the tram.

Description

Rail fault detection method and system based on infrared thermal imaging and computer vision
Technical Field
The invention relates to the technical field of rail detection, in particular to a rail fault detection method and system based on infrared thermal imaging and computer vision.
Background
Modern trams have entered daily life thanks to their environmental friendliness, safety, comfort, and flexibility. Unlike the metro, however, a modern tram does not have a fully independent right of way: the tram track overlaps or intersects with the motorway, and when the tram runs at higher speed and with a large passenger load, foreign matter on the track poses a great threat to driving safety. At present, tram rail fault detection mainly relies on manual inspection and maintenance, which is slow, time-consuming, unsafe and labor-intensive; it can even affect the daily operation of the tram and block traffic on the tram track section, thereby congesting urban traffic. Existing obstacle detection technology can be divided by detection method as follows: radar-based obstacle detection transmits signals by radar and obtains the distance between the sensor and the target by measuring quantities such as the time difference between the transmitted and reflected signals; the radar method has poor stability, complex equipment debugging and high cost. Computer-vision-based obstacle detection mainly relies on a camera installed on the vehicle to acquire image information in front of the vehicle and detects obstacles by digital image processing.
In summary, the conventional detection technology has certain limitations, and therefore, how to efficiently and accurately detect the fault of the tramway becomes an urgent problem to be solved.
Disclosure of Invention
In order to overcome the defects and shortcomings of the prior art, the invention provides a rail fault detection method and system based on infrared thermal imaging and computer vision. An unmanned aerial vehicle is used for inspection, and tram rail faults are comprehensively detected by temperature detection and image recognition, including detection of short-circuit heating of the rail power supply system and detection of rail foreign matter (such as illegally parked vehicles, abandoned bicycles and boulders). By detecting rail foreign matter and power-supply short-circuit heating automatically with the unmanned aerial vehicle, the method and system can carry out efficient real-time monitoring under emergency conditions, reduce the accident rate of rail transit and improve rail safety.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention provides a rail fault detection method based on infrared thermal imaging and computer vision, which comprises the following steps:
s1: installing a high-definition camera and an infrared thermal imager on an unmanned aerial vehicle, and transmitting an acquired track image back to a ground station in real time in the inspection process of the unmanned aerial vehicle;
s2: image preprocessing: the ground station receives high-definition camera image data and carries out image preprocessing, wherein the image preprocessing comprises image graying, image filtering, image enhancement and edge detection;
s3: extracting a track image: the high-definition camera image data is first segmented with the darker gray threshold of the inside of the track groove, then segmented with the brighter gray threshold of the area outside the groove, and finally the track area is segmented according to the adjacent-distance characteristics of the brighter and darker areas to extract the track image;
s4: infrared temperature detection: according to the position information of the extracted track image in the original image acquired by the high-definition camera, and combining the position and angle relation between the infrared thermal imager and the high-definition camera, the corresponding track position in the infrared thermograph is obtained; the infrared thermograph received from the infrared thermal imager is grayed, gray values are extracted, and a relative temperature difference method is used to judge whether a high-temperature area exists on the track; if so, the high-temperature area is extracted and its area and highest temperature point are calculated;
s5: screening suspected track foreign matter: the preprocessed image is superposed with the extracted track image and masked to obtain a region of interest, the region of interest is subjected to edge-closure judgment and filling to obtain connected regions, and the connected regions are screened to obtain suspected track foreign matter;
s6: track foreign matter identification: the suspected track foreign matter is input into a BP neural network for identification to obtain a foreign matter classification result.
As a preferred technical solution, the image preprocessing in step S2 includes image graying, image filtering, image enhancement, and image edge detection, and specifically includes the steps of:
s21: the high-definition camera acquires a color image, and graying processing is performed to obtain an image Pgray, expressed as:
Pgray=0.30R+0.59G+0.11B;
wherein R denotes a pixel value of a red component in the color image, G denotes a pixel value of a green component in the color image, and B denotes a pixel value of a blue component in the color image;
s22: carrying out image filtering by adopting a discrete Gaussian filter function, carrying out weighted average on the image, scanning each pixel point in the image by adopting a Gaussian template, and replacing the gray value of the center of the Gaussian template by the weighted average value of a pixel neighborhood, wherein the discrete Gaussian filter function H (i, j) is as follows:
H(i, j) = (1 / (2πδ²)) · exp(−(i² + j²) / (2δ²));
wherein, (i, j) represents the coordinates of a point in the neighborhood, and δ represents the standard deviation;
s23: image enhancement is performed by changing the gray value of the image pixel, and the processed image pixel value is g (x, y) and is expressed as:
g(x, y) = [f(x, y)]²;
wherein f (x, y) represents the pixel value of the image at the (x, y) point after the image graying and image filtering processing, the image grayscale range is [0,255], and if the calculated result g (x, y) exceeds 255, the value is set to 255;
s24: selecting a Canny detection operator for image edge detection: firstly, performing convolution operation on a Gaussian mask and an image subjected to image graying and image filtering, keeping the information of a single pixel unchanged, then calculating the amplitude and the direction of a gradient by using a first-order partial derivative difference, then performing non-maximum suppression on the amplitude of the gradient, and finally detecting and connecting an image edge by using a dual-threshold method, wherein the amplitude and the direction of the gradient are respectively expressed as follows:
M(x, y) = √(Sx² + Sy²);
θ(x, y) = arctan(Sy / Sx);
where Sx and Sy represent the partial derivatives of the image gray level in the x and y directions, respectively.
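A minimal Python/OpenCV sketch of the preprocessing chain of steps S21–S24 is given below (the embodiment names Python and OpenCV as the development environment); the Gaussian kernel size follows the 9 × 9 template of the embodiment, while the sigma and the Canny thresholds are illustrative assumptions rather than values fixed by the method.

```python
import cv2
import numpy as np

def preprocess(bgr_image):
    # S21: graying (OpenCV's BGR2GRAY applies weights of the same 0.30/0.59/0.11 kind)
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)

    # S22: Gaussian filtering with a 9 x 9 template; sigma is an assumed value
    blurred = cv2.GaussianBlur(gray, (9, 9), 2.0)

    # S23: contrast stretching g(x, y) = [f(x, y)]^2, clipped to the 0..255 gray range
    enhanced = np.clip(blurred.astype(np.float32) ** 2, 0, 255).astype(np.uint8)

    # S24: Canny edge detection on the filtered image; the dual thresholds are assumptions
    edges = cv2.Canny(blurred, 50, 150)
    return gray, blurred, enhanced, edges
```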
As a preferred technical solution, in step S3, the high-definition camera image data is first segmented with the darker gray threshold of the inside of the track groove and then segmented with the brighter gray threshold of the area outside the groove, with the specific calculation formulas:
gL(x, y) = 1 if TL − ΔTL ≤ f(x, y) ≤ TL + ΔTL, and 0 otherwise;
gH(x, y) = 1 if TH − ΔTH ≤ f(x, y) ≤ TH + ΔTH, and 0 otherwise;
where f(x, y) represents the preprocessed gray image, TL represents the minimum gray level of the darker region in the groove, TH represents the maximum gray level of the brighter region outside the groove, and ΔTL and ΔTH respectively represent the ranges within which the gray level fluctuates around TL and TH;
the track area is then segmented according to the adjacent-distance characteristics of the brighter and darker areas, and the track image is extracted, with the following specific steps:
the darker-region binary image gL(x, y) inside the groove and the brighter-region binary image gH(x, y) outside the groove are dilated to obtain the corresponding region-segmentation binary images g′L(x, y) and g′H(x, y), and their intersection is taken to obtain the track-region segmentation binary image gu(x, y), expressed as:
gu(x, y) = g′L(x, y) ∩ g′H(x, y);
in the track-region segmentation binary image gu(x, y), the starting pixel points of the tracks on both sides are determined, further pixel points on the tracks are obtained by tracking to form a plurality of track lines, the track lines of the two side tracks are extracted from them, piecewise quadratic fitting is performed by the least-squares method, and the track equation is constructed to obtain the extracted track image.
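A sketch, under assumptions, of the two-pass threshold segmentation and the dilation/intersection of step S3 follows; the thresholds TL and TH, the fluctuation ranges and the structuring-element size are placeholders to be tuned for the actual camera and lighting, not values given by the patent.

```python
import cv2
import numpy as np

def segment_track_region(f, T_L=40, T_H=200, dT_L=20, dT_H=20, kernel_size=15):
    """f: preprocessed grayscale image; all threshold values are illustrative assumptions."""
    # darker region inside the groove: gray values within T_L +/- dT_L
    g_L = ((f >= T_L - dT_L) & (f <= T_L + dT_L)).astype(np.uint8)
    # brighter region outside the groove: gray values within T_H +/- dT_H
    g_H = ((f >= T_H - dT_H) & (f <= T_H + dT_H)).astype(np.uint8)

    # dilate both binary images so that adjacent dark/bright regions come to overlap
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (kernel_size, kernel_size))
    g_L_d = cv2.dilate(g_L, kernel)
    g_H_d = cv2.dilate(g_H, kernel)

    # intersection keeps only pixels where a dark in-groove area lies next to a
    # bright out-of-groove area, i.e. the track region g_u(x, y)
    g_u = cv2.bitwise_and(g_L_d, g_H_d)
    return g_u
```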
As a preferable technical solution, the relative temperature difference method in the infrared temperature detection in step S4 specifically includes the steps of:
s41: reading a temperature value on a display screen of an infrared thermal imager;
s42: graying the infrared thermal imaging image to obtain an information matrix of a brightness value;
s43: a mapping relationship between the temperature value and the gray value is established by fitting, G = f(T), where G represents the gray value and T represents the temperature;
s44: comparing the detected track temperature result with a temperature value when the track normally works, obtaining the track temperature change trend by adopting curve fitting, and replacing the temperature value with a gray value according to the mapping relation between the temperature and the gray value to obtain a gray threshold value;
s45: the fault area is divided through a gray threshold, different gray values are set, the fault area exceeding the gray threshold is extracted through edge detection, pixels in the fault area are counted to obtain the area, the gray value is compared to obtain the pixel point of the maximum gray value, and the highest temperature point is obtained.
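A short sketch of the extraction in S45: once the gray threshold converted from the temperature threshold is available, the region area is the count of pixels above it and the hottest point is the pixel with the maximum gray value inside that region. The threshold value itself is assumed to be supplied by the mapping step.

```python
import cv2
import numpy as np

def extract_hot_region(ir_gray, gray_threshold):
    """ir_gray: grayed infrared thermograph; gray_threshold: value converted from the
    temperature threshold via the fitted temperature-gray mapping."""
    # binarize: pixels above the gray threshold belong to the suspected fault region
    _, fault_mask = cv2.threshold(ir_gray, gray_threshold, 255, cv2.THRESH_BINARY)

    # region area = number of pixels above the threshold
    area = int(np.count_nonzero(fault_mask))
    if area == 0:
        return area, None

    # hottest point = pixel with the maximum gray value inside the fault region
    masked = np.where(fault_mask > 0, ir_gray, 0)
    hottest_point = np.unravel_index(np.argmax(masked), masked.shape)  # (row, col)
    return area, hottest_point
```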
As a preferred technical solution, in step S5, the preprocessed image and the extracted track image are superimposed, where the superimposition formula is:
S(i,j)=R(i,j)&ROI;
wherein R (i, j) represents the preprocessed image, ROI represents a region of interest, and S (i, j) represents the operation result image;
the connected region is screened, and the screening formula is as follows:
DArea≥S;
DHeight≥D∩DWidth≥Dlow∩DWidth≤Dhigh
a ratio condition relating the bounding rectangle of the connected region to DRatio;
where DArea represents the number of pixels occupied by the connected region, DHeight represents the height of the bounding rectangle of the connected region, DWidth represents the width of the bounding rectangle of the connected region, and S, D, Dlow, Dhigh and DRatio respectively represent the area of the suspected track foreign matter, the length of the minimum bounding rectangle, the width of the minimum bounding rectangle, the diagonal length of the minimum bounding rectangle and the rectangle ratio;
when the screening formulas hold simultaneously, the screened connected region is suspected track foreign matter.
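A sketch of the ROI masking and connected-region screening of S5 follows, assuming the shape-constant values S, D, Dlow, Dhigh and DRatio are supplied by the user. Closed-edge filling is approximated here by taking outer contours, and the final aspect-ratio test stands in for the ratio condition above, whose exact form is an assumption.

```python
import cv2

def screen_connected_regions(edge_image, roi_mask, S, D, D_low, D_high, D_ratio):
    """edge_image: preprocessed edge image R(i, j); roi_mask: binary track ROI (uint8).
    S, D, D_low, D_high, D_ratio: shape-constant values of the suspected foreign matter."""
    # S(i, j) = R(i, j) & ROI: keep edge information inside the track region only
    s = cv2.bitwise_and(edge_image, edge_image, mask=roi_mask)

    # approximate edge-closure judgment and filling by taking external contours
    contours, _ = cv2.findContours(s, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for c in contours:
        area = cv2.contourArea(c)
        x, y, w, h = cv2.boundingRect(c)
        # screening: D_Area >= S, D_Height >= D, D_low <= D_Width <= D_high
        if area >= S and h >= D and D_low <= w <= D_high:
            # assumed aspect-ratio bound standing in for the original ratio condition
            if h / max(w, 1) <= D_ratio:
                candidates.append((x, y, w, h))
    return candidates
```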
As a preferred technical solution, the training step of the BP neural network is:
s60: numerical initialization: the number n of input-layer nodes, the number l of hidden-layer nodes and the number m of output-layer nodes of the BP neural network are set; the weight from the input layer to the hidden layer is ωij, the weight from the hidden layer to the output layer is ωjk, the threshold from the input layer to the hidden layer is aj, the threshold from the hidden layer to the output layer is bk, the learning rate is η, and the excitation function is g(x); the excitation function g(x) adopts the Sigmoid function, expressed as:
g(x) = 1 / (1 + e^(−x));
wherein x is an input matrix;
s61: inputting a training sample: taking a track image shot by a high-definition camera as an original image, obtaining an image sample containing a foreign matter to be identified, carrying out image graying and binarization processing to obtain a binarized image of the sample, unifying the obtained samples to the size of the same proportion, and inputting the samples into a BP (back propagation) neural network;
s62: judging whether the training sample is loaded completely, if so, executing the next step, and if not, executing the step S61;
s63: let the output of the hidden layer be Hj, and compute the output of the hidden-layer neurons:
Hj = g( ∑i=1..n ωij·xi − aj ), j = 1, 2, …, l;
where n is the number of input-layer nodes, ωij is the weight from the input layer to the hidden layer, xi is the input matrix, and aj is the threshold from the input layer to the hidden layer;
s64: let the output of the output layer be Ok, and compute the output of the output-layer neurons:
Ok = ∑j=1..l Hj·ωjk − bk, k = 1, 2, …, m;
where l is the number of hidden-layer nodes, ωjk is the weight from the hidden layer to the output layer, and bk is the threshold from the hidden layer to the output layer;
s65: calculating the error:
ek = Yk − Ok, k = 1, 2, …, m;
where ek is the error, m is the number of output-layer nodes, Yk is the desired output, and Ok is the output of the output layer;
s66: updating the weights:
ωij = ωij + η·Hj·(1 − Hj)·xi·∑k=1..m ωjk·ek;
ωjk = ωjk + η·Hj·ek;
where ωij is the weight from the input layer to the hidden layer, ωjk is the weight from the hidden layer to the output layer, η is the learning rate, Hj is the output of the hidden layer, xi is the input matrix, m is the number of output-layer nodes, and ek is the error;
s67: updating the thresholds:
aj = aj + η·Hj·(1 − Hj)·∑k=1..m ωjk·ek;
bk = bk + η·ek;
where aj is the threshold from the input layer to the hidden layer, bk is the threshold from the hidden layer to the output layer, ωjk is the weight from the hidden layer to the output layer, η is the learning rate, Hj is the output of the hidden layer, xi is the input matrix, m is the number of output-layer nodes, and ek is the error;
s68: judging whether the difference between two successive errors is smaller than a set value; if so, the training of the BP neural network is finished, and if not, steps S63–S67 are executed cyclically.
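A NumPy sketch of the training loop of S60–S68 follows, using the update rules as reconstructed above (with the error taken as ek = Yk − Ok); the layer sizes, learning rate and stopping tolerance are illustrative assumptions, and the caller supplies the sample and one-hot label arrays.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_bp(samples, labels, n=20, l=8, m=3, eta=0.1, tol=1e-4, max_epochs=10000):
    """samples: array (num_samples, n); labels: one-hot array (num_samples, m)."""
    rng = np.random.default_rng(0)
    w_ij = rng.uniform(-1, 1, (n, l))   # input -> hidden weights
    w_jk = rng.uniform(-1, 1, (l, m))   # hidden -> output weights
    a = rng.uniform(-1, 1, l)           # hidden-layer thresholds
    b = rng.uniform(-1, 1, m)           # output-layer thresholds

    prev_err = None
    for _ in range(max_epochs):
        total_err = 0.0
        for x, y in zip(samples, labels):
            H = sigmoid(x @ w_ij - a)        # S63: hidden-layer output
            O = H @ w_jk - b                 # S64: output-layer output
            e = y - O                        # S65: error e_k = Y_k - O_k
            total_err += 0.5 * np.sum(e ** 2)

            # S66: weight updates as stated above
            w_ij += eta * np.outer(x, H * (1 - H) * (w_jk @ e))
            w_jk += eta * np.outer(H, e)
            # S67: threshold updates as stated above
            a += eta * H * (1 - H) * (w_jk @ e)
            b += eta * e
        # S68: stop when the change between two successive epoch errors is small
        if prev_err is not None and abs(prev_err - total_err) < tol:
            break
        prev_err = total_err
    return w_ij, w_jk, a, b
```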
The invention also provides a rail fault detection system based on infrared thermal imaging and computer vision, which comprises: the unmanned aerial vehicle comprises a main control module, a flight control module, a navigation module, a wireless communication module and an aerial photography module, wherein the main control module controls the navigation module, the wireless communication module and the aerial photography module, the flight control module is used for controlling the flight state of the unmanned aerial vehicle, the navigation module is used for providing navigation for the unmanned aerial vehicle, the wireless communication module is used for the unmanned aerial vehicle to carry out wireless communication with the ground station, the aerial photography module is used for acquiring a track image, and the aerial photography module comprises a high-definition camera and an infrared thermal imager;
the ground station comprises an image preprocessing module, a track image extraction module, a high-temperature region extraction module, a suspected track foreign matter screening module and a track foreign matter identification module, wherein the image preprocessing module is used for preprocessing the high-definition camera image data, the track image extraction module is used for multi-threshold track-region segmentation and track-image extraction, the high-temperature region extraction module is used for relative temperature difference judgment and for extracting the high-temperature region and the highest-temperature point in the infrared thermograph, the suspected track foreign matter screening module is used for screening connected regions to obtain suspected track foreign matter, the screened connected regions being obtained by performing edge-closure judgment and filling on the region of interest, and the track foreign matter identification module is provided with a BP (back-propagation) neural network and is used for inputting the suspected track foreign matter into the BP neural network for identification to obtain a foreign matter classification result.
As a preferred technical solution, the BP neural network is provided with an input layer, a hidden layer, and an output layer, the number of input nodes of the input layer is set to 20, the number of output nodes of the output layer is set to 3, and the number of hidden layer nodes of the hidden layer is set to:
n = √(ni + no) + a;
where n is the number of hidden-layer nodes, ni is the number of input nodes, no is the number of output nodes, and a is a constant in the range [1, 10].
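This empirical rule can be evaluated directly; with 20 input nodes and 3 output nodes it gives roughly 6 to 15 hidden nodes depending on the constant a, and the value 8 used in the embodiment corresponds to a ≈ 3.

```python
import math

def hidden_nodes(n_inputs, n_outputs, a):
    # n = sqrt(n_i + n_o) + a, with a a constant in [1, 10]
    return round(math.sqrt(n_inputs + n_outputs) + a)

print(hidden_nodes(20, 3, 3))  # -> 8, the hidden-layer size used in the embodiment
```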
Compared with the prior art, the invention has the following advantages and beneficial effects:
(1) The invention applies the unmanned aerial vehicle to rail inspection, inspecting the tram rail during non-operation periods or upon sudden failure, and has the characteristics of accuracy, high efficiency and a wide field of view. When the unmanned aerial vehicle navigates along the rail line, the camera is basically parallel to the ground and the image background does not change greatly, so the acquired dynamic background can be approximately regarded as static, which greatly reduces background interference and facilitates extraction of effective image information.
(2) According to the invention, the rail image is extracted through a multi-threshold method and a skeleton-extraction idea: the original image is segmented first with the darker gray threshold in the rail groove, then with the brighter gray threshold outside the rail groove, and finally the rail area is segmented according to the adjacent-distance characteristics of the brighter and darker areas, so that accurate rail information is extracted and the interference of redundant information is effectively avoided.
(3) The method adopts a computer vision technology, effectively reduces the influence of false foreign matters through a suspected track foreign matter screening step, and improves the accuracy of track foreign matter identification by using an artificial neural network to detect the target.
(4) The invention adopts an infrared thermal imaging temperature detection technology, converts the temperature threshold into the gray threshold by utilizing the mapping relation between the temperature and the gray value, can quickly and accurately position and mark the position with overhigh temperature of the track power supply system, returns the fault point information to the ground station, and is convenient for the working personnel to know the fault condition in time and perform the subsequent work.
Drawings
FIG. 1 is a schematic flowchart of a rail fault detection method based on infrared thermal imaging and computer vision according to an embodiment;
FIG. 2 is a schematic diagram of an infrared temperature detection flow of the rail fault detection method based on infrared thermal imaging and computer vision according to the embodiment;
fig. 3 is a schematic diagram illustrating a suspected rail foreign matter screening process of the rail fault detection method based on infrared thermal imaging and computer vision in this embodiment;
FIG. 4 is a schematic diagram of the BP neural network structure according to the present embodiment;
FIG. 5 is a schematic diagram of a BP neural network training process according to this embodiment;
fig. 6 is a schematic structural diagram of the drone of the rail fault detection system based on infrared thermal imaging and computer vision in this embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Examples
As shown in fig. 1, the embodiment provides a rail fault detection method based on infrared thermal imaging and computer vision, which carries an unmanned aerial vehicle, performs comprehensive detection on rail faults of the electric car by using temperature detection and image recognition, and includes detection of short-circuit heating of a rail power supply system and detection of rail foreign matters (illegal parking vehicles, waste bicycles and boulders), and has the advantages of convenience in installation and simplicity in debugging; the visual field is wide, and the fault can be accurately and efficiently detected; the manpower cost is low.
The rail fault detection method based on infrared thermal imaging and computer vision provided by the embodiment is developed in a python environment, an OpenCV computer vision library is used, and an image is analyzed and processed through a temperature threshold conversion method and a BP neural network, and the specific process comprises the following steps:
s1: installing a high-definition camera and an infrared thermal imager on an unmanned aerial vehicle, adjusting angle and focusing parameter information, controlling the unmanned aerial vehicle to carry out inspection according to a preset line, and transmitting image data of a tramcar track image acquired by the camera back to a ground station in real time through a 5G network in the inspection process of the unmanned aerial vehicle;
s2: image preprocessing: the ground station receives the real-time image data and first performs image preprocessing, including image graying, image filtering, image enhancement and edge detection, so as to reduce the influence of changes in weather, road conditions and illumination angle on track identification, improve the anti-interference capability and highlight track information, allowing track foreign matter to be detected quickly and accurately; the specific steps are:
s21: the ground station receives the color image acquired by the high-definition camera; the color image contains a large amount of information such as object colors, so to shorten the processing time, image graying is adopted to reduce the amount of calculation and highlight the useful information of the image, using the formula:
Pgray=0.30R+0.59G+0.11B (1)
where R represents the pixel value of the red component in the color image, G represents the pixel value of the green component, B represents the pixel value of the blue component, and Pgray represents the converted gray-scale image;
s22: in order to improve image quality and alleviate the degradation caused by noise interference, image denoising is performed by Gaussian filtering. In this embodiment, each pixel in the image is scanned with a 9 × 9 Gaussian template, and the Gaussian filter replaces the gray value at the center of the template with the weighted mean of the pixel neighborhood; because the weight of each neighborhood pixel decreases with its distance from the center point, the image becomes smoother.
The discrete Gaussian filter function used in this embodiment is:
H(i, j) = (1 / (2πδ²)) · exp(−(i² + j²) / (2δ²)) (2);
where H(i, j) is the filter function, (i, j) are the coordinates of a point in the neighborhood, and δ is the standard deviation;
in this embodiment, the coordinates (i, j) of each point in the neighborhood are substituted into formula (2), and the obtained Gaussian function value is used as the corresponding coefficient of the template;
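A short sketch of generating the template coefficients from formula (2) is given below; the 9 × 9 size comes from this embodiment, while normalizing the kernel so that its coefficients sum to 1 is an added assumption rather than something stated in the text.

```python
import numpy as np

def gaussian_template(size=9, delta=2.0):
    """Build a size x size Gaussian template by substituting the neighborhood
    coordinates (i, j) into H(i, j) = exp(-(i^2 + j^2) / (2*delta^2)) / (2*pi*delta^2)."""
    half = size // 2
    i, j = np.mgrid[-half:half + 1, -half:half + 1]
    h = np.exp(-(i ** 2 + j ** 2) / (2 * delta ** 2)) / (2 * np.pi * delta ** 2)
    return h / h.sum()  # normalization assumed so the weighted mean preserves brightness
```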
s23: in order to improve the image definition and contrast, the image effect is enhanced by directly changing the gray value of the image pixel, namely contrast stretching:
g(x, y) = [f(x, y)]² (3);
where f(x, y) is the pixel value of the image at point (x, y) after graying and filtering, g(x, y) is the processed pixel value, and the image gray range is 0–255; if the calculated result exceeds 255, it is set to 255. This enhancement weakens the brightness and contrast of the darker parts of the image and strengthens those of the relatively brighter parts;
s24: a Canny detection operator is selected to obtain a complete image contour. First, a convolution operation is performed between a Gaussian mask and the image after graying and filtering, leaving the information of individual pixels unchanged; second, the amplitude and direction of the gradient are calculated by first-order partial-derivative differences, with Sx and Sy denoting the partial derivatives of the image gray level in the x and y directions, so that the amplitude and direction of the gradient are expressed as:
M(x, y) = √(Sx² + Sy²) (4);
θ(x, y) = arctan(Sy / Sx) (5);
then non-maximum suppression is applied to the gradient amplitude, and finally the edges are detected and connected by a dual-threshold method.
S3: extracting a track image: the detection range for tram track faults is mainly between and above the rails; the tram track adopts a grooved rail, which is dark inside the groove and bright outside it, so a multi-threshold track-area segmentation method is adopted to segment the darker area inside the groove and the brighter area outside the groove in two passes, thereby accurately extracting the track information;
the multi-threshold extraction of the track image is to accurately extract track information according to the fact that the detection range of the track fault of the electric car is mainly the features between tracks and above the tracks, firstly, the track area is divided, and then, the track feature points are extracted. Because the tramcar track adopts the concave track, the concave track groove is dark, the groove is bright, the gray value difference is obvious, and the dark area in the groove is adjacent to the bright area outside the groove, so the original image of the high-definition camera received by the ground station is divided once by using the dark gray threshold in the track groove, then the bright gray threshold outside the track groove is used for division, and finally the track area is divided according to the adjacent distance characteristic of the bright dark area to obtain a clear track image. Defining the preprocessed gray image as f (x, y), and determining the minimum value T of gray in the darker area in the grooveLAnd the maximum value T of the gray scale of the bright area outside the grooveH. The dark area division in the groove and the bright area division outside the groove are respectively carried out according to the following formulas:
Figure BDA0002092916250000123
Figure BDA0002092916250000124
wherein the content of the first and second substances,
Figure BDA0002092916250000125
are each TLAnd THUp and down fluctuating gray scale range, to the darker area binary image g in the tankL(x,y)Binary image g of the lighter area outside the sum binH(x,y)Expanding to obtain a corresponding region segmentation binary image
Figure BDA0002092916250000131
And
Figure BDA0002092916250000132
and solving the intersection of the two region segmentation binary images obtained after expansion, and obtaining a region segmentation binary image with complete track regions with greatly reduced interference as shown in the following formula.
Figure BDA0002092916250000133
On the basis of the threshold-segmentation binary image, the starting pixel of the track is first searched according to a strict detection criterion; other pixels on the target are then found with the idea of skeleton extraction according to the positions and positional relations of the points; finally, interfering line segments are removed using prior knowledge, and the track equation is constructed by a least-squares piecewise quadratic fitting method. The specific process is as follows:
(1) In the track-region segmentation binary image gu(x, y), determine the starting pixel points of the left and right tracks;
in gu(x, y), determine the possible starting points xLn and xRm (n, m = 1, 2, 3, …) of the tracks on both sides. Define the left-track starting-point search range [xLs, xLe] and the right-track starting-point search range [xRs, xRe]. Search the Y-th row, starting from the bottom of the image; the midpoints xLn ∈ {xLs, xLe} and xRm ∈ {xRs, xRe} of the transverse connected regions found within the search ranges are all regarded as possible starting points. If no starting point exists within the search range, search row Y − 1, and so on; if no starting point has been found by row Ymin, abandon the starting-point search for that side of the track;
(2) track and acquire the other pixels on the rails to obtain a plurality of track lines;
tracking each starting point with the tracking criterion yields one track line. Taking xL1 as an example, let the transverse connected region of xL1 be [xL1s, xL1e] and define it as the initial value of the tracking search range, with row number yL1. Extend this region to [xL1s − Te, xL1e + Te] as the search range of the new row yL1 − 1, where Te is the number of pixel points extended to the left and right. The center points xL11, xL12, …, xL1k of the connected regions found in this range are all regarded as tracked track points; if no track point can be found in the region, search row yL1 − 2, and so on, up to Yrow rows. When track points are tracked in the current row, the tracking search range of the new row is determined: suppose row yL1 − 1 tracks to track points; then the left endpoint xL11s of the connected region of xL11 and the right endpoint xL1ke of the connected region of xL1k are extended to form the new tracking search range [xL11s − Te, xL1ke + Te], and the previous steps are repeated to search for track points in the next row until the limit requirement is met;
(3) extract the longest and most complete left and right track lines from the plurality of track lines;
the longest and most complete trace among the plurality of traces is selected as the track line. Among the track lines xL1, xL2, …, xLn obtained by pixel tracking, the one with the largest number of pixel points is taken as the left track line; among the track lines xR1, xR2, …, xRm obtained by pixel tracking, the one with the largest number of pixel points is taken as the right track line;
(4) construct the track equation by the least-squares piecewise quadratic fitting method;
the collected left and right track feature points are segmented according to the search order, and every N points are quadratically fitted by the least-squares method. If the starting point or end point of the track does not meet the limit requirement, the first N points and the last N points are each quadratically fitted by the least-squares method and extended until the limit requirement is met; the specific value of N is determined by the actual situation;
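A sketch of the piecewise quadratic fitting of step (4): the feature points collected along one rail are split into groups of N in search order and each group is fitted with a quadratic by least squares. N is left as a parameter, since the text states that its specific value depends on the actual situation; fitting x as a function of the row coordinate y is an assumption that suits near-vertical rails in the image.

```python
import numpy as np

def fit_track(points, n_per_segment=30):
    """points: list of (x, y) rail feature points in search order.
    Returns one set of quadratic coefficients (a, b, c) per segment, x = a*y^2 + b*y + c."""
    coeffs = []
    for start in range(0, len(points), n_per_segment):
        seg = np.array(points[start:start + n_per_segment])
        if len(seg) < 3:                     # a quadratic needs at least three points
            continue
        x, y = seg[:, 0], seg[:, 1]
        coeffs.append(np.polyfit(y, x, 2))   # least-squares quadratic fit per segment
    return coeffs
```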
s4: infrared temperature detection: according to the position information of the extracted track image in the original high-definition camera image, and combining the position and angle relation between the infrared thermal imager and the high-definition camera to eliminate a certain error, the corresponding track position in the infrared thermograph is obtained; the infrared thermograph received from the infrared thermal imager is grayed and the gray values are extracted; a relative temperature difference method is used to judge whether a high-temperature area exists on the track; if so, the high-temperature area is extracted, its area and highest temperature point are calculated, and the record is stored and sent to the staff; if not, the next frame of image continues to be detected;
as shown in fig. 2, in the infrared temperature detection of this embodiment, based on the mapping characteristics associated with the temperature value and the gray value of the image, the temperature value and the gray value of each pixel point of the infrared thermography are collected, and a preset temperature threshold is converted into a gray threshold to determine a temperature rise region; according to the characteristics of the fault region of the infrared thermography, calculating the area and the mass center of the fault region, and specifically comprising the following steps:
s41: directly reading a temperature value from a display screen of the infrared thermal imager;
s42: carrying out graying processing on the infrared thermograph to obtain an information matrix of brightness values in the range [0, 255];
s43: the infrared thermograph has fuzzy edges, poor contrast and similar characteristics that are unfavorable for machine analysis, so this embodiment improves the image contrast using a mapping function between the temperature value and the gray value. Because the gray-value data is obtained by direct graying, a mapping relation exists between the temperature value and the gray value, and sample points are selected to fit the temperature and gray data:
G = f(T) (9);
where T and G represent the temperature and the gray value, respectively;
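A sketch, under assumptions, of this fitting step: sample points read from the thermal imager are fitted to obtain G = f(T), and a preset temperature threshold is then converted into the gray threshold used for segmentation. The sample values and the linear form of the fit below are placeholders, not data or expressions from the patent.

```python
import numpy as np

# (temperature in deg C, gray value) sample points read from the thermal imager;
# the numbers below are placeholders, not data from the patent
samples_T = np.array([20.0, 35.0, 50.0, 65.0, 80.0])
samples_G = np.array([40.0, 90.0, 140.0, 190.0, 240.0])

# fit G = f(T); a first-order (linear) fit is an assumed form of the mapping
coeff = np.polyfit(samples_T, samples_G, 1)
temp_to_gray = np.poly1d(coeff)

# convert a preset temperature threshold into the gray threshold used for segmentation
temperature_threshold = 60.0          # assumed alarm temperature
gray_threshold = float(np.clip(temp_to_gray(temperature_threshold), 0, 255))
```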
s44: analyzing the detected track temperature result and a temperature value when the track normally works by adopting a longitudinal comparison (namely a relative temperature difference method), obtaining the track temperature change trend by using a curve fitting method, and replacing the temperature value with a gray value according to a mapping relation of temperature and gray to obtain a gray threshold value;
s45: dividing the fault region through the gray threshold, and calculating the region area and the heating center: the gray value of the part exceeding the gray threshold is set to 255 (displayed as white) and that of the other parts to 0 (displayed as black), and the part exceeding the gray threshold (namely the fault region) is extracted based on edge detection; in this embodiment, the regionprops function in MATLAB software is used to count the pixels in the fault region to obtain the region area, and the gray values are compared to obtain the pixel point with the maximum gray value, namely the heating center point;
In this embodiment, an abnormal line of the rail power supply system is often accompanied by heating: when the contact at a switchgear contact, wire joint or similar point is poor, the local temperature rises due to heat loss after current flows. Performing high-temperature detection on the rails and the range between the two rails therefore allows further analysis and judgment of whether the line is abnormal, so that the abnormality can be resolved in time;
s5: screening suspected track foreign matter: the preprocessed image is superposed with the extracted track image and masked to obtain a region of interest, the region of interest is subjected to edge-closure judgment and filling, and the result is further screened by formulas to eliminate interference;
as shown in fig. 3, the suspected rail foreign matter screening method of this embodiment is to superimpose the preprocessed image and the extracted rail image by using a masking method, where the superimposing formula is as follows:
S(i,j)=R(i,j)&ROI (10)
wherein R (i, j) is the preprocessed image, ROI is the region of interest, and the edge detection information in the region of interest is only reserved in the operation result image S (i, j) through the logical operation of R (i, j) and the ROI;
and judging whether the edge of the edge detection image in the region of interest is closed or not, and filling the closed image to be used as a connected region. Screening the statistics such as area, size and duty ratio of the connected region, wherein the screening formula is as follows:
DArea≥S (11)
DHeight≥D∩DWidth≥Dlow∩DWidth≤Dhigh; (12)
a ratio condition relating the bounding rectangle of the connected region to DRatio (13);
where DArea represents the number of pixels occupied by the connected region, DHeight represents the height of the bounding rectangle of the connected region, DWidth represents the width of the bounding rectangle of the connected region, and S, D, Dlow, Dhigh and DRatio are shape-characteristic constant values of the suspected track foreign matter: respectively the area, the length of the minimum bounding rectangle, the width of the minimum bounding rectangle, the diagonal length of the minimum bounding rectangle, and the rectangle ratio;
when the three formulas (11), (12) and (13) are simultaneously satisfied, the obtained connected area is the suspected track foreign matter.
S6: track foreign matter identification: the most widely used artificial neural network, a multilayer feedforward network trained with the error back-propagation learning algorithm (the BP neural network), is adopted; the suspected track foreign matter screened in the previous step is fed to the network trained in advance on a sample library for identification to obtain a specific foreign-matter result, and the data is stored and reported to the staff.
In this embodiment, the track foreign object recognition adopts a multi-layer feedforward neural network and an error back propagation learning algorithm, i.e., a BP neural network, as shown in fig. 4, the BP neural network is composed of three layers, an input layer, a hidden layer, and an output layer.
The present example divides the rail foreign matter into three categories: illegal parking vehicles, abandoned bicycles and boulders, so that 20 input nodes are set, 3 output nodes are set, and the selection of hidden layer nodes is determined according to the following formula:
n = √(ni + no) + a;
where n is the number of hidden-layer nodes, ni is the number of input nodes, no is the number of output nodes, and a is a constant between 1 and 10.
As shown in fig. 5, the training procedure of the neural network in this embodiment is as follows:
s60: initializing the weight and the threshold:
determine the number of input-layer nodes of the BP neural network n = 20, the number of hidden-layer nodes l = 8 and the number of output-layer nodes m = 3; set the weight from the input layer to the hidden layer as ωij, the weight from the hidden layer to the output layer as ωjk, the threshold from the input layer to the hidden layer as aj, the threshold from the hidden layer to the output layer as bk, the learning rate as η and the excitation function as g(x), where the excitation function g(x) takes the Sigmoid function, expressed as:
g(x) = 1 / (1 + e^(−x));
where x is the input matrix; ωij, ωjk, aj and bk are assigned randomly;
S61: inputting a training sample;
In this embodiment, many images returned by the unmanned aerial vehicle inspection system are used as original images to obtain image samples containing the foreign matter to be identified; the samples are RGB color images containing illegally parked vehicles, abandoned bicycles and the like. Image graying and binarization are carried out to obtain the binarized image of each sample. For accuracy and speed of training, and while keeping the shape characteristics of vehicles and bicycles undistorted, the obtained samples are unified to the same size: 300 pixels high and 200 pixels wide; the method adopted in this embodiment is to pad the periphery of all samples with the background color, namely black pixels;
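A sketch of this size normalization: each binarized sample is scaled to fit inside 300 × 200 pixels without distorting its aspect ratio, and the remaining border is padded with black, as described above; the interpolation choice and centering of the padded sample are assumptions.

```python
import cv2
import numpy as np

def normalize_sample(binary_img, target_h=300, target_w=200):
    """Scale a binarized sample to fit 300x200 while keeping its aspect ratio,
    then pad the border with black pixels (the background color)."""
    h, w = binary_img.shape[:2]
    scale = min(target_h / h, target_w / w)
    new_h, new_w = max(1, int(h * scale)), max(1, int(w * scale))
    resized = cv2.resize(binary_img, (new_w, new_h), interpolation=cv2.INTER_NEAREST)

    canvas = np.zeros((target_h, target_w), dtype=np.uint8)   # black background
    top = (target_h - new_h) // 2
    left = (target_w - new_w) // 2
    canvas[top:top + new_h, left:left + new_w] = resized
    return canvas
```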
s62: judging whether the training sample is loaded completely, if so, executing the next step, and if not, executing the step S61;
s63: let the output of the hidden layer be Hj; the output of each hidden-layer neuron is calculated according to the following formula:
Hj = g( ∑i=1..n ωij·xi − aj ), j = 1, 2, …, l;
where n is the number of input-layer nodes, ωij is the weight from the input layer to the hidden layer, xi is the input matrix, and aj is the threshold from the input layer to the hidden layer;
s64: let the output of the output layer be Ok; the output of each output-layer neuron is calculated according to the following formula:
Ok = ∑j=1..l Hj·ωjk − bk, k = 1, 2, …, m;
where l is the number of hidden-layer nodes, ωjk is the weight from the hidden layer to the output layer, and bk is the threshold from the hidden layer to the output layer;
s65: calculating the error:
ek = Yk − Ok, k = 1, 2, …, m;
where ek is the error, m is the number of output-layer nodes, Yk is the desired output, and Ok is the output of the output layer;
s66: updating the weights:
ωij = ωij + η·Hj·(1 − Hj)·xi·∑k=1..m ωjk·ek;
ωjk = ωjk + η·Hj·ek;
where ωij is the weight from the input layer to the hidden layer, ωjk is the weight from the hidden layer to the output layer, η is the learning rate, Hj is the output of the hidden layer, xi is the input matrix, m is the number of output-layer nodes, and ek is the error;
s67: updating the thresholds:
aj = aj + η·Hj·(1 − Hj)·∑k=1..m ωjk·ek;
bk = bk + η·ek;
where aj is the threshold from the input layer to the hidden layer, bk is the threshold from the hidden layer to the output layer, ωjk is the weight from the hidden layer to the output layer, η is the learning rate, Hj is the output of the hidden layer, xi is the input matrix, m is the number of output-layer nodes, and ek is the error;
s68: judging whether the difference between two successive errors is smaller than a specified value, i.e., whether the training target is reached; if so, the training is finished, and if not, steps S63–S67 are executed cyclically.
After the neural network is trained, the suspected track foreign matter image obtained in the previous step is placed into the network for recognition, and the type of the foreign matter is determined.
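Classification then amounts to a single forward pass through the trained network; the small sketch below assumes the trained parameters returned by the training step above and the three foreign-matter classes named in this embodiment.

```python
import numpy as np

def classify(sample_vec, w_ij, w_jk, a, b):
    """Forward pass with trained parameters; sample_vec is the feature vector of a
    suspected foreign-matter image, and the three outputs follow the embodiment's classes."""
    H = 1.0 / (1.0 + np.exp(-(sample_vec @ w_ij - a)))   # hidden-layer output
    O = H @ w_jk - b                                      # output-layer output
    classes = ["illegally parked vehicle", "abandoned bicycle", "boulder"]
    return classes[int(np.argmax(O))]
```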
The present embodiment further provides a rail fault detection system based on infrared thermal imaging and computer vision, including: an unmanned aerial vehicle and a ground station;
as shown in fig. 6, the unmanned aerial vehicle comprises a main control module, a flight control module, a navigation module, a wireless communication module and an aerial photography module, wherein the main control module controls the navigation module, the wireless communication module and the aerial photography module, the flight control module is used for controlling the flight state of the unmanned aerial vehicle, the navigation module is used for providing navigation for the unmanned aerial vehicle, the wireless communication module is used for the unmanned aerial vehicle to perform wireless communication with a ground station, the aerial photography module is used for acquiring a track image, the aerial photography module comprises a high-definition camera and an infrared thermal imager, and the high-definition camera and the infrared thermal imager are parallel to the track and are used for acquiring the track image;
in this embodiment, the ground station includes an image preprocessing module, a track image extraction module, a high-temperature region extraction module, a suspected track foreign matter screening module and a track foreign matter identification module; the image preprocessing module is used for preprocessing the image data of the high-definition camera, the track image extraction module is used for segmenting the multi-threshold track region and extracting the track image, the high-temperature region extraction module is used for judging the relative temperature difference and extracting the high-temperature region and the highest-temperature point in the infrared thermograph, the suspected track foreign matter screening module is used for screening connected regions to obtain suspected track foreign matter, the screened connected regions being obtained after edge-closure judgment and filling of the region of interest, and the track foreign matter identification module is provided with a BP neural network and is used for inputting the suspected track foreign matter into the BP neural network for identification to obtain a foreign matter classification result.
Through the unmanned aerial vehicle, this embodiment can carry out omnidirectional detection above the tram track within ten minutes; isolating people from the fault point effectively ensures the safety of the measurement personnel and reduces the danger of the work, while the short time consumption improves the efficiency of fault detection. The images returned by the unmanned aerial vehicle system are processed in real time by temperature detection and image identification, and the maintenance personnel are informed at the first moment of the fault location and cause, which helps maintenance work to be carried out with good accuracy and timeliness.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such changes, modifications, substitutions, combinations, and simplifications are intended to be included in the scope of the present invention.

Claims (8)

1. A rail fault detection method based on infrared thermal imaging and computer vision is characterized by comprising the following steps:
s1: installing a high-definition camera and an infrared thermal imager on an unmanned aerial vehicle, and transmitting an acquired track image back to a ground station in real time in the inspection process of the unmanned aerial vehicle;
s2: image preprocessing: the ground station receives high-definition camera image data and carries out image preprocessing, wherein the image preprocessing comprises image graying, image filtering, image enhancement and edge detection;
s3: extracting a track image: the high-definition camera image data is first segmented with the darker gray threshold of the inside of the track groove, then segmented with the brighter gray threshold of the area outside the groove, and finally the track area is segmented according to the adjacent-distance characteristics of the brighter and darker areas to extract the track image;
s4: infrared temperature detection: according to the position information of the extracted track image in the original image acquired by the high-definition camera, and combining the position and angle relation between the infrared thermal imager and the high-definition camera, the corresponding track position in the infrared thermograph is obtained; the infrared thermograph received from the infrared thermal imager is grayed, gray values are extracted, and a relative temperature difference method is used to judge whether a high-temperature area exists on the track; if so, the high-temperature area is extracted and its area and highest temperature point are calculated;
s5: screening suspected track foreign matter: the preprocessed image is superposed with the extracted track image and masked to obtain a region of interest, the region of interest is subjected to edge-closure judgment and filling to obtain connected regions, and the connected regions are screened to obtain suspected track foreign matter;
s6: track foreign matter identification: the suspected track foreign matter is input into a BP neural network for identification to obtain a foreign matter classification result.
2. The rail fault detection method based on infrared thermal imaging and computer vision as claimed in claim 1, wherein the image preprocessing in step S2 includes image graying, image filtering, image enhancement and image edge detection, and the specific steps are as follows:
s21: the high-definition camera acquires a color image, and graying processing is performed to obtain an image Pgray, expressed as:
Pgray=0.30R+0.59G+0.11B;
wherein R denotes a pixel value of a red component in the color image, G denotes a pixel value of a green component in the color image, and B denotes a pixel value of a blue component in the color image;
s22: carrying out image filtering by adopting a discrete Gaussian filter function, carrying out weighted average on the image, scanning each pixel point in the image by adopting a Gaussian template, and replacing the gray value of the center of the Gaussian template by the weighted average value of a pixel neighborhood, wherein the discrete Gaussian filter function H (i, j) is as follows:
H(i, j) = (1 / (2πδ²)) · exp(−(i² + j²) / (2δ²));
wherein, (i, j) represents the coordinates of a point in the neighborhood, and δ represents the standard deviation;
s23: image enhancement is performed by changing the gray value of the image pixel, and the processed image pixel value is g (x, y) and is expressed as:
g(x, y) = [f(x, y)]²;
wherein f (x, y) represents the pixel value of the image at the (x, y) point after the image graying and image filtering processing, the image grayscale range is [0,255], and if the calculated result g (x, y) exceeds 255, the value is set to 255;
s24: selecting a Canny detection operator for image edge detection: firstly, performing convolution operation on a Gaussian mask and an image subjected to image graying and image filtering, keeping the information of a single pixel unchanged, then calculating the amplitude and the direction of a gradient by using a first-order partial derivative difference, then performing non-maximum suppression on the amplitude of the gradient, and finally detecting and connecting an image edge by using a dual-threshold method, wherein the amplitude and the direction of the gradient are respectively expressed as follows:
M(x, y) = √(Sx² + Sy²);
θ(x, y) = arctan(Sy / Sx);
where Sx and Sy represent the partial derivatives of the image gray level in the x and y directions, respectively.
3. The method for detecting track fault based on infrared thermal imaging and computer vision according to claim 1, wherein in step S3, after the image data of the high-definition camera is segmented once by using the darker gray threshold in the track groove, the image data of the high-definition camera is segmented by using the lighter gray threshold outside the track groove, and the specific calculation formula is as follows:
g_L(x, y) = 1, if |f(x, y) − T_L| ≤ ΔT_L; otherwise g_L(x, y) = 0;
g_H(x, y) = 1, if |f(x, y) − T_H| ≤ ΔT_H; otherwise g_H(x, y) = 0;
wherein f(x, y) represents the preprocessed grayscale image, T_L indicates the minimum gray level of the darker region inside the groove, T_H represents the maximum gray level of the brighter region outside the groove, and ΔT_L and ΔT_H respectively represent the ranges within which T_L and T_H fluctuate up and down;
The track area is then segmented according to the adjacent-distance characteristics of the bright and dark areas, and the track image is extracted, with the following specific steps:
dilate the binary image g_L(x, y) of the darker in-groove area and the binary image g_H(x, y) of the lighter out-of-groove area to obtain the corresponding dilated region-segmentation binary images g_L'(x, y) and g_H'(x, y), and take their intersection to obtain the track-region segmentation binary image g_u(x, y), expressed as:
g_u(x, y) = g_L'(x, y) ∩ g_H'(x, y);
determine initial pixel points of the rails on both sides on the track-region segmentation binary image g_u(x, y), track them to obtain multiple pixel points on the rails and thereby multiple track lines, extract the track lines of the rails on both sides from the multiple track lines, perform quadratic fitting by piecewise least squares, and construct the track equation to obtain the extracted track image.
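A sketch of the dilation, intersection and quadratic fitting described above; the structuring-element size and the use of np.polyfit for the least-squares fit are assumptions about one possible realization.

import numpy as np
import cv2

def track_region(g_L, g_H, kernel_size=15):
    # Dilate both binary images, then intersect them to obtain g_u(x, y)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (kernel_size, kernel_size))
    gL_d = cv2.dilate(g_L, kernel)
    gH_d = cv2.dilate(g_H, kernel)
    return cv2.bitwise_and(gL_d, gH_d)

def fit_rail(points_xy):
    # Quadratic least-squares fit x = a*y^2 + b*y + c for one rail's tracked pixel points
    ys = points_xy[:, 1].astype(np.float64)
    xs = points_xy[:, 0].astype(np.float64)
    return np.polyfit(ys, xs, 2)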
4. The rail fault detection method based on infrared thermal imaging and computer vision as claimed in claim 1, wherein the relative temperature difference method in the infrared temperature detection of step S4 comprises the following specific steps:
s41: reading a temperature value on a display screen of an infrared thermal imager;
s42: gray the infrared thermal imaging image to obtain an information matrix of brightness values;
s43: set a mapping relationship between the temperature value and the gray value, expressed as:
G = 255 × (T − T_min) / (T_max − T_min);
wherein G represents the gray value, T represents the temperature, and T_min and T_max represent the minimum and maximum temperatures of the infrared thermal image;
s44: compare the detected track temperature with the temperature value during normal track operation, obtain the track temperature change trend by curve fitting, and convert the temperature value into a gray value according to the temperature-gray mapping relationship to obtain a gray threshold;
s45: segment the fault area by the gray threshold and assign different gray values; extract the fault area exceeding the gray threshold by edge detection; count the pixels within the fault area to obtain its area; and compare gray values to find the pixel with the maximum gray value, which gives the highest temperature point.
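A sketch of steps S43 to S45, assuming the linear temperature-gray mapping reconstructed above; the contour-based area measurement and the threshold values are illustrative choices, not the patent's own.

import numpy as np
import cv2

def temp_to_gray(t, t_min, t_max):
    # Assumed linear mapping between temperature and gray value (step S43)
    return float(np.clip(255.0 * (t - t_min) / (t_max - t_min), 0, 255))

def hot_regions(ir_gray, gray_threshold):
    # Step S45: segment areas above the gray threshold, then measure area and hottest point
    mask = (ir_gray >= gray_threshold).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    regions = []
    for c in contours:
        region_mask = np.zeros_like(mask)
        cv2.drawContours(region_mask, [c], -1, 255, thickness=-1)
        area = int(np.count_nonzero(region_mask))                    # area in pixels
        _, max_val, _, max_loc = cv2.minMaxLoc(ir_gray, mask=region_mask)
        regions.append({"area": area, "hottest_point": max_loc, "max_gray": max_val})
    return regions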
5. The method for detecting rail faults based on infrared thermal imaging and computer vision as claimed in claim 1, wherein in step S5 the preprocessed image and the extracted rail image are superposed, the superposition formula being:
S(i,j)=R(i,j)&ROI;
wherein R (i, j) represents the preprocessed image, ROI represents a region of interest, and S (i, j) represents the operation result image;
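For illustration, the bitwise-AND superposition S(i, j) = R(i, j) & ROI could be written with OpenCV as below; the argument names are assumptions.

import cv2

def superpose(preprocessed, roi_mask):
    # S(i, j) = R(i, j) & ROI: keep preprocessed pixels only inside the region of interest
    return cv2.bitwise_and(preprocessed, preprocessed, mask=roi_mask)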
the connected areas are screened, and the screening formulas are as follows:
D_Area ≥ S;
D_Height ≥ D ∩ D_low ≤ D_Width ≤ D_high;
D_Height / D_Width ≥ D_Ratio;
wherein D_Area represents the number of pixels occupied by the connected area, D_Height represents the height of the bounding rectangle of the connected area, D_Width represents the width of the bounding rectangle of the connected area, and S, D, D_low, D_high and D_Ratio respectively represent the threshold values for the area of a suspected track foreign matter, the minimum length of the bounding rectangle, the lower and upper bounds of the bounding-rectangle width, and the aspect ratio of the bounding rectangle;
when all the screening formulas hold simultaneously, the screened connected area is a suspected track foreign matter.
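A sketch of this screening using connected-component statistics; the aspect-ratio condition follows the reconstruction above, and every threshold value is a placeholder rather than a value from the patent.

import cv2

def screen_regions(binary, s_min=80, d_min=10, d_low=5, d_high=200, d_ratio=0.5):
    # Per-region statistics: area and bounding-box width/height
    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    suspects = []
    for i in range(1, num):                      # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        w = stats[i, cv2.CC_STAT_WIDTH]
        h = stats[i, cv2.CC_STAT_HEIGHT]
        if area >= s_min and h >= d_min and d_low <= w <= d_high and h / max(w, 1) >= d_ratio:
            suspects.append(i)                   # all screening conditions hold simultaneously
    return suspects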
6. The rail fault detection method based on infrared thermal imaging and computer vision as claimed in claim 1, wherein the training step of the BP neural network is:
s60: numerical initialization: set the number of input-layer nodes n, the number of hidden-layer nodes l and the number of output-layer nodes m of the BP neural network; set the input-layer-to-hidden-layer weights as ω_ij, the hidden-layer-to-output-layer weights as ω_jk, the input-layer-to-hidden-layer threshold as a_j, the hidden-layer-to-output-layer threshold as b_k, the learning rate as η and the excitation function as g(x); the excitation function g(x) adopts the Sigmoid function, expressed as:
g(x) = 1 / (1 + e^(−x))
wherein x is an input matrix;
s61: input training samples: take the track image captured by the high-definition camera as the original image, obtain image samples containing the foreign matter to be identified, perform image graying and binarization to obtain binarized sample images, unify the obtained samples to the same proportional size, and input them into the BP neural network;
s62: judge whether the training samples have all been loaded; if so, execute the next step; if not, execute step S61;
s63: let the output of the hidden layer be H_j and compute the outputs of the hidden-layer neurons:
H_j = g(Σ_{i=1..n} ω_ij · x_i − a_j), j = 1, 2, …, l;
wherein n is the number of input-layer nodes, ω_ij are the weights from the input layer to the hidden layer, x_i is the input matrix, and a_j is the threshold from the input layer to the hidden layer;
s64: let the output of the output layer be O_k and compute the outputs of the output-layer neurons:
O_k = Σ_{j=1..l} H_j · ω_jk − b_k, k = 1, 2, …, m;
wherein l is the number of hidden-layer nodes, ω_jk are the weights from the hidden layer to the output layer, and b_k is the threshold from the hidden layer to the output layer;
s65: calculate the error:
e_k = Y_k − O_k, k = 1, 2, …, m;
wherein e_k is the error, m is the number of output-layer nodes, Y_k is the desired output and O_k is the output of the output layer;
s66: update the weights:
ω_ij = ω_ij + η · H_j · (1 − H_j) · x_i · Σ_{k=1..m} ω_jk · e_k;
ω_jk = ω_jk + η · H_j · e_k;
wherein ω_ij are the weights from the input layer to the hidden layer, ω_jk are the weights from the hidden layer to the output layer, η is the learning rate, H_j is the output of the hidden layer, x_i is the input matrix, m is the number of output-layer nodes and e_k is the error;
s67: update the thresholds:
a_j = a_j + η · H_j · (1 − H_j) · Σ_{k=1..m} ω_jk · e_k;
b_k = b_k + η · e_k;
wherein a_j is the threshold from the input layer to the hidden layer, b_k is the threshold from the hidden layer to the output layer, ω_jk are the weights from the hidden layer to the output layer, η is the learning rate, H_j is the output of the hidden layer, m is the number of output-layer nodes and e_k is the error;
s68: judge whether the difference between two successive errors is smaller than a set value; if so, the training of the BP neural network is completed; if not, execute steps S63 to S67 in a loop.
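A compact NumPy sketch of the forward pass (S63-S64), error (S65) and update rules (S66-S67) exactly as written above; the network sizes, learning rate, random-initialization range and hidden-layer width are assumptions for illustration.

import numpy as np

def sigmoid(x):
    # Excitation function g(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

class BPNetwork:
    def __init__(self, n, l, m, eta=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.w_ih = rng.uniform(-0.5, 0.5, (n, l))   # omega_ij, input -> hidden
        self.w_ho = rng.uniform(-0.5, 0.5, (l, m))   # omega_jk, hidden -> output
        self.a = np.zeros(l)                         # hidden-layer thresholds a_j
        self.b = np.zeros(m)                         # output-layer thresholds b_k
        self.eta = eta

    def forward(self, x):
        H = sigmoid(x @ self.w_ih - self.a)          # S63: H_j = g(sum_i w_ij x_i - a_j)
        O = H @ self.w_ho - self.b                   # S64: O_k = sum_j H_j w_jk - b_k
        return H, O

    def train_step(self, x, y):
        H, O = self.forward(x)
        e = y - O                                    # S65: e_k = Y_k - O_k
        back = self.w_ho @ e                         # sum_k w_jk e_k for each hidden node
        self.w_ih += self.eta * np.outer(x, H * (1 - H) * back)   # S66, weight update omega_ij
        self.w_ho += self.eta * np.outer(H, e)                    # S66, weight update omega_jk
        self.a += self.eta * H * (1 - H) * back                   # S67, threshold update a_j
        self.b += self.eta * e                                    # S67, threshold update b_k
        return float(np.sum(np.abs(e)))

# Example dimensions matching claim 8 (20 inputs, 3 outputs); the hidden size 9 is only an example
net = BPNetwork(n=20, l=9, m=3)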
7. A rail fault detection system based on infrared thermal imaging and computer vision, comprising an unmanned aerial vehicle and a ground station, wherein the unmanned aerial vehicle comprises a main control module, a flight control module, a navigation module, a wireless communication module and an aerial photography module; the main control module controls the navigation module, the wireless communication module and the aerial photography module; the flight control module is used for controlling the flight state of the unmanned aerial vehicle; the navigation module is used for providing navigation for the unmanned aerial vehicle; the wireless communication module is used for wireless communication between the unmanned aerial vehicle and the ground station; the aerial photography module is used for acquiring track images and comprises a high-definition camera and an infrared thermal imager;
the ground station comprises an image preprocessing module, a track image extraction module, a high-temperature region extraction module, a suspected track foreign matter screening module and a track foreign matter identification module; the image preprocessing module is used for preprocessing the image data of the high-definition camera; the track image extraction module is used for multi-threshold track region segmentation: the image data of the high-definition camera is first segmented using a darker gray threshold inside the track groove and then segmented using a lighter gray threshold outside the track groove, the track region is segmented according to the adjacent-distance characteristics of the bright and dark areas, and the track image is extracted; the high-temperature region extraction module is used for relative temperature difference judgment and for extracting the high-temperature region and highest temperature point in the infrared thermal image: according to the position information of the extracted track image in the original image acquired by the high-definition camera, and using the position and angle relationship between the infrared thermal imager and the high-definition camera, the corresponding track position in the infrared thermal image is obtained, the infrared thermal image received by the infrared thermal imager is grayed and its gray values are extracted, whether a high-temperature area exists on the rail is judged by the relative temperature difference method, and if so, the high-temperature area is extracted and its area and highest temperature point are calculated; the suspected track foreign matter screening module is used for screening the connected areas to obtain suspected track foreign matter, the screened connected areas being obtained by performing edge-closure judgment and filling on the region of interest, and the region of interest being obtained by superposing the preprocessed image and the extracted track image with a mask; the track foreign matter identification module is provided with a BP neural network and is used for inputting the suspected track foreign matter into the BP neural network for identification to obtain a foreign matter classification result.
8. The infrared thermal imaging and computer vision based rail fault detection system of claim 7, wherein the BP neural network is provided with an input layer, a hidden layer and an output layer, the number of input nodes of the input layer is set to 20, the number of output nodes of the output layer is set to 3, and the number of hidden layer nodes of the hidden layer is set to:
n = √(n_i + n_o) + a
wherein n is the number of hidden-layer nodes, n_i is the number of input nodes, n_o is the number of output nodes, and a is a constant in the range [1, 10].
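For illustration, the empirical hidden-node formula of claim 8 could be evaluated as below; rounding the square root to the nearest integer and the choice a = 4 are assumptions.

import math

def hidden_nodes(n_i, n_o, a):
    # n = sqrt(n_i + n_o) + a, with a a constant in [1, 10]
    return round(math.sqrt(n_i + n_o)) + a

print(hidden_nodes(20, 3, 4))   # sqrt(23) ~ 4.80 -> 5, plus a = 4 gives 9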
CN201910509281.6A 2019-06-13 2019-06-13 Rail fault detection method and system based on infrared thermal imaging and computer vision Active CN110261436B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910509281.6A CN110261436B (en) 2019-06-13 2019-06-13 Rail fault detection method and system based on infrared thermal imaging and computer vision

Publications (2)

Publication Number Publication Date
CN110261436A CN110261436A (en) 2019-09-20
CN110261436B true CN110261436B (en) 2022-03-22

Family

ID=67917947

Country Status (1)

Country Link
CN (1) CN110261436B (en)

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110991377B (en) * 2019-12-11 2023-09-19 辽宁工业大学 Front mesh identification method of automobile safety auxiliary system based on monocular vision neural network
CN111223097B (en) * 2020-03-13 2024-04-19 中冶长天国际工程有限责任公司 Trolley grate bar paste blocking degree detection method and system of sintering machine
CN111476759B (en) * 2020-03-13 2022-03-25 深圳市鑫信腾机器人科技有限公司 Screen surface detection method and device, terminal and storage medium
CN111260906B (en) * 2020-03-18 2021-11-16 云境商务智能研究院南京有限公司 Intelligent agricultural system based on embedded mode
CN111256841A (en) * 2020-03-26 2020-06-09 深圳市永达电子信息股份有限公司 Track state detection method and detection system thereof
CN111488868B (en) * 2020-03-27 2023-06-30 贵州电网有限责任公司 High-temperature area identification method and system based on transformer infrared image
CN111626104B (en) * 2020-04-13 2023-09-08 国网上海市电力公司 Cable hidden trouble point detection method and device based on unmanned aerial vehicle infrared thermal image
CN111751024A (en) * 2020-06-09 2020-10-09 江苏雷威建设工程有限公司 Intelligent track temperature control method, system and device and storage medium thereof
CN111860404A (en) * 2020-07-28 2020-10-30 华润智慧能源有限公司 Photovoltaic panel hot spot positioning method and system
CN112001963A (en) * 2020-07-31 2020-11-27 浙江大华技术股份有限公司 Fire fighting channel investigation method, system and computer equipment
CN117893491A (en) * 2020-08-07 2024-04-16 长江三峡通航管理局 Ship lock large gear tooth surface oil shortage degree judging method based on oil shortage index
CN111986168B (en) * 2020-08-07 2024-03-15 中国农业大学 T-branch binaural root detection and optimal auricular root temperature measurement frame detection method and system
CN111967526B (en) * 2020-08-20 2023-09-22 东北大学秦皇岛分校 Remote sensing image change detection method and system based on edge mapping and deep learning
CN112051298B (en) * 2020-09-09 2021-06-04 飞础科智慧科技(上海)有限公司 Steel ladle surface fault diagnosis method and equipment
CN112255141B (en) * 2020-10-26 2021-05-11 光谷技术有限公司 Thermal imaging gas monitoring system
CN112947588A (en) * 2021-03-01 2021-06-11 中国南方电网有限责任公司超高压输电公司贵阳局 Unmanned aerial vehicle electric wire netting patrols line system
CN112927223A (en) * 2021-03-29 2021-06-08 南通大学 Glass curtain wall detection method based on infrared thermal imager
CN113110577A (en) * 2021-04-15 2021-07-13 中国南方电网有限责任公司超高压输电公司柳州局 Unmanned aerial vehicle flight route planning management system is patrolled and examined to electric wire netting
CN113591251B (en) * 2021-08-17 2024-04-23 中冶北方(大连)工程技术有限公司 Equipment fault temperature analysis and diagnosis method
CN113673614B (en) * 2021-08-25 2023-12-12 北京航空航天大学 Metro tunnel foreign matter intrusion detection device and method based on machine vision
CN113640308B (en) * 2021-08-31 2024-03-29 夏冰心 Rail anomaly monitoring system based on machine vision
CN116071287A (en) * 2021-10-29 2023-05-05 重庆药羚科技有限公司 Liquid separation process monitoring method and system, storage medium and terminal
CN114019879B (en) * 2021-11-08 2024-03-15 河南数字中原数据有限公司 High-low voltage power distribution cabinet monitoring system
CN114383735B (en) * 2021-12-17 2024-03-26 暨南大学 Thermal power generating unit air cooling array temperature field monitoring method and device based on machine vision
CN114670899A (en) * 2022-04-20 2022-06-28 北京运达华开科技有限公司 Image acquisition device for track detection system
CN115188091B (en) * 2022-07-13 2023-10-13 国网江苏省电力有限公司泰州供电分公司 Unmanned aerial vehicle gridding inspection system and method integrating power transmission and transformation equipment
CN114937040B (en) * 2022-07-22 2022-11-18 珠海优特电力科技股份有限公司 Train inspection method, device and system for rail transit vehicle section and storage medium
CN115460895B (en) * 2022-11-10 2023-02-17 武汉至驱动力科技有限责任公司 Electronic water pump controller heat dissipation method based on temperature field image information
CN116468729B (en) * 2023-06-20 2023-09-12 南昌江铃华翔汽车零部件有限公司 Automobile chassis foreign matter detection method, system and computer
CN116630304B (en) * 2023-07-18 2023-09-19 东莞市京品精密模具有限公司 Lithium battery mold processing detection method and system based on artificial intelligence
CN116721095B (en) * 2023-08-04 2023-11-03 杭州瑞琦信息技术有限公司 Aerial photographing road illumination fault detection method and device
CN116758085B (en) * 2023-08-21 2023-11-03 山东昆仲信息科技有限公司 Visual auxiliary detection method for infrared image of gas pollution
CN117341781B (en) * 2023-12-04 2024-03-22 深圳市鼎善信息科技有限公司 Rail transit fault processing method and device, computer equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0394932B1 (en) * 1989-04-24 1998-03-04 Siemens Aktiengesellschaft Photothermal inspection method, arrangement for its working out, and utilisation of the method
CN102305664A (en) * 2011-05-19 2012-01-04 中国农业大学 Thermal imaging temperature measurement and fault location inspection system
CN102508110A (en) * 2011-10-10 2012-06-20 上海大学 Texture-based insulator fault diagnostic method
CN102608162A (en) * 2012-02-20 2012-07-25 首都师范大学 Threshold segmentation method for ultrasonic infrared thermograph
CN103308521A (en) * 2012-08-29 2013-09-18 中国人民解放军第二炮兵工程大学 Method for enhancing infrared thermal wave detection image defect contrast
CN206132684U (en) * 2016-10-21 2017-04-26 上海工程技术大学 Subway contact net node internal defect detection device
CN108932721A (en) * 2018-06-28 2018-12-04 上海电力学院 A kind of infrared Image Segmentation and fusion method for crusing robot



Similar Documents

Publication Publication Date Title
CN110261436B (en) Rail fault detection method and system based on infrared thermal imaging and computer vision
CN105744232B (en) A kind of method of the transmission line of electricity video external force damage prevention of Behavior-based control analytical technology
CN101030256B (en) Method and apparatus for cutting vehicle image
CN109255350B (en) New energy license plate detection method based on video monitoring
CN110033431B (en) Non-contact detection device and detection method for detecting corrosion area on surface of steel bridge
CN110232380A (en) Fire night scenes restored method based on Mask R-CNN neural network
CN113436157B (en) Vehicle-mounted image identification method for pantograph fault
CN109911550A (en) Scratch board conveyor protective device based on infrared thermal imaging and visible light video analysis
CN111161160B (en) Foggy weather obstacle detection method and device, electronic equipment and storage medium
CN113034378B (en) Method for distinguishing electric automobile from fuel automobile
CN111080650A (en) Method for detecting looseness and loss faults of small part bearing blocking key nut of railway wagon
CN105426894A (en) Image detection method and device for railroad plug nails
CN111582084B (en) Weak supervision learning-based rail foreign matter detection method and system under empty base view angle
CN113111840A (en) Method for early warning violation and dangerous behaviors of operators on fully mechanized coal mining face
CN114772208B (en) Non-contact belt tearing detection system and method based on image segmentation
CN112508911A (en) Rail joint touch net suspension support component crack detection system based on inspection robot and detection method thereof
CN114708532A (en) Monitoring video quality evaluation method, system and storage medium
Saini et al. Feature-based template matching for joggled fishplate detection in railroad track with drone images
Luo et al. Waterdrop removal from hot-rolled steel strip surfaces based on progressive recurrent generative adversarial networks
CN112115767B (en) Tunnel foreign matter detection method based on Retinex and YOLOv3 models
CN114581863A (en) Vehicle dangerous state identification method and system
CN114708544A (en) Intelligent violation monitoring helmet based on edge calculation and monitoring method thereof
CN115861825B (en) 2C detection method based on image recognition
CN117115097B (en) TEDS detection method and system based on anomaly detection
CN115100620B (en) Lane line fitting method based on road color and driving direction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant