CN113947116B - A camera-based non-contact real-time detection method for train track looseness - Google Patents


Info

Publication number
CN113947116B
Authority
CN
China
Prior art keywords
track
virtual feature
pixel
gradient
feature points
Prior art date
Legal status
Active
Application number
CN202111163756.4A
Other languages
Chinese (zh)
Other versions
CN113947116A (en)
Inventor
徐自力
辛存
王存俊
李康迪
Current Assignee
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN202111163756.4A priority Critical patent/CN113947116B/en
Publication of CN113947116A publication Critical patent/CN113947116A/en
Application granted granted Critical
Publication of CN113947116B publication Critical patent/CN113947116B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12Classification; Matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256Lane; Road marking
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Quality & Reliability (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

To address the low efficiency of the current practice of detecting train track looseness by manual inspection, the present invention discloses a camera-based non-contact real-time detection method for train track looseness. The method uses a high-speed camera to record video of the track vibrating under a passing train or under artificial excitation. Exploiting the staggered arrangement of the rails and sleepers, a virtual-feature-point detection method based on image pixel grayscale gradients and a feature clustering algorithm is proposed. An optical flow algorithm computes the optical flow of the virtual feature points, from which the time-domain vibration of the track is measured. The natural vibration frequency of the track is obtained by FFT decomposition, and changes in this natural frequency are used to judge in real time whether the track has loosened. The proposed method uses a simple measurement setup with high precision, low cost and easy operation.

Description

A camera-based non-contact real-time detection method for train track looseness

Technical field

The invention belongs to the technical field of structural motion measurement, and in particular relates to a camera-based non-contact real-time detection method for train track looseness.

Background art

The train track system is an important part of the transportation system, consisting mainly of rails, sleepers, fasteners and roadbed. The rails are fixed to the sleepers by fasteners. As a train passes, the periodic impact loads it generates cause the fasteners to vibrate, and over time this vibration loosens the track. As loosening worsens, the amplitude of the track structure's dynamic response grows markedly and, in severe cases, can cause derailment. Detecting track looseness is therefore of great significance for safe train operation.

At present, train track looseness is detected mainly by manual inspection, in which experienced inspectors judge the condition of the fasteners by eye. Although simple to carry out, this approach has low efficiency, high cost, a high missed-detection rate and significant safety hazards. In recent years, with advances in image processing, computer vision has gradually been applied to the structural health monitoring of train tracks. Existing vision-based methods, however, require physical markers on the structure and measure track vibration by tracking the marked points. Because tracks are numerous and permanently exposed to the outdoor environment, markers fall off easily, which makes such measurement very challenging.

Summary of the invention

To overcome the shortcomings of the prior art described above and to solve the low efficiency and low accuracy of manual track-looseness inspection, the purpose of the present invention is to provide a camera-based non-contact real-time detection method for train track looseness. Exploiting the staggered arrangement of the rails and sleepers, a virtual-feature-point detection method based on image pixel grayscale gradients and a feature clustering algorithm is proposed. An optical flow algorithm computes the optical flow of the virtual feature points, from which the time-domain vibration of the track is measured; the natural vibration frequency of the track is then obtained by FFT decomposition, and changes in this frequency are used to judge in real time whether the track has loosened.

To achieve the above objects, the technical solution adopted by the present invention is as follows:

A camera-based non-contact real-time detection method for train track looseness, comprising the following steps:

Step 1): use a high-speed camera to record video of the vibration of the train track;

Step 2): convert each frame of the recorded video to grayscale;

Step 3): compute the grayscale gradients of the pixels of the first frame, cluster the gradients with the K-means algorithm to determine the rail and fastener regions, and take the pixel grayscale extremum of each region as that region's virtual feature point;

Step 4): construct a multi-scale image pyramid by multi-scale image decomposition, establish optical flow equations for the images at each scale based on the short-time brightness constancy assumption, and compute the optical flow of the virtual feature points at each scale with a least squares algorithm;

Step 5): fuse the optical flows of the virtual feature points across scales using the scale relationship between pyramid images to obtain the virtual feature point optical flow, and obtain the time-domain vibration signals of the rail and fasteners with image calibration;

Step 6): perform frequency-domain analysis of the time-domain vibration signals of the rail and fasteners to obtain the natural frequency of the track, and detect track looseness from the change in natural frequency.

Further, step 2) specifically: to improve the efficiency of virtual feature point screening, convert the color images captured by the camera to grayscale frame by frame:

I(x,y) = 0.299·R(x,y) + 0.587·G(x,y) + 0.114·B(x,y)

where I(x, y) is the grayscale value of pixel (x, y), and R(x, y), G(x, y), B(x, y) are the values of the three color channels of pixel (x, y).
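The weighted-sum conversion above can be sketched in a few lines. This is a minimal illustration using NumPy; the patent does not specify an implementation, and the function name is chosen here for clarity:

```python
import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB image to grayscale using the
    luma weights of the formula above."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

# A 1x2 test image: one pure-red pixel and one pure-white pixel.
img = np.array([[[255, 0, 0], [255, 255, 255]]], dtype=np.float64)
gray = to_grayscale(img)  # red -> 0.299*255, white -> 255
```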

Further, step 3) specifically: screen the virtual feature points based on the image pixel grayscale gradients and the feature clustering algorithm.

In the present invention, let the set of horizontal grayscale gradients computed from the pixels of the first frame be $G_x = \{G_x^1, G_x^2, \ldots, G_x^l\}$. The K-means clustering algorithm is applied to cluster the horizontal gradients:

$$E_x = \sum_{v=1}^{2} \sum_{G_x^i \in C_v} \left(G_x^i - \bar{G}_x^v\right)^2$$

where $E_x$ is the within-cluster sum of squares in the horizontal direction, which is minimized at the optimal clustering; $v$ indexes the clusters (two clusters are used in the invention, with $v = 1$ denoting the rail region and $v = 2$ the other regions); $\bar{G}_x^v$ is the mean horizontal gradient of cluster $v$; and $l$ is the number of image pixels.

Let the set of vertical grayscale gradients computed from the pixels of the first frame be $G_y = \{G_y^1, G_y^2, \ldots, G_y^l\}$. The K-means clustering algorithm is applied to cluster the vertical gradients:

$$E_y = \sum_{h=1}^{2} \sum_{G_y^i \in C_h} \left(G_y^i - \bar{G}_y^h\right)^2$$

where $E_y$ is the within-cluster sum of squares in the vertical direction, which is minimized at the optimal clustering; $h$ indexes the clusters (two clusters are used in the invention, with $h = 1$ denoting the fastener and sleeper regions and $h = 2$ the other regions); $\bar{G}_y^h$ is the mean vertical gradient of cluster $h$; and $l$ is the number of image pixels.

In the present invention, the detected rail and fastener regions are denoted $\Omega = (\Omega_1, \Omega_2, \ldots, \Omega_q)$, where $q$ is the number of rail and fastener regions determined. The pixel grayscale extremum point of each region is taken as that region's virtual feature point. For region $\Omega_1$, the grayscale extremum is

$$(k_x, k_y) = \arg\max_{(x, y) \in \Omega_1} I(x, y)$$

where $(k_x, k_y)$ is the pixel coordinate of the point with the largest grayscale value in $\Omega_1$, taken in the present invention as the virtual feature point of $\Omega_1$; $I(k_x, k_y)$ is the grayscale value of the virtual feature point; $I(x, y)$ is the grayscale value at pixel $(x, y)$; and $n_2$ is the number of pixels in $\Omega_1$.

The grayscale extremum points of the rail and fastener regions at different positions are detected, yielding virtual feature points at different positions.
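The gradient clustering used to isolate the rail and fastener regions can be sketched as a one-dimensional, two-cluster K-means. The function, initialization and synthetic gradient data below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def kmeans_1d(values: np.ndarray, n_clusters: int = 2, n_iter: int = 50):
    """Minimal 1-D K-means (Lloyd's algorithm): returns labels and centers.
    Here it splits pixel gradients into a high-gradient cluster
    (rail/fastener edges) and a low-gradient background cluster."""
    centers = np.array([values.min(), values.max()], dtype=float)[:n_clusters]
    for _ in range(n_iter):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for c in range(n_clusters):
            if np.any(labels == c):
                centers[c] = values[labels == c].mean()
    return labels, centers

# Synthetic horizontal gradients: mostly flat background plus a band
# of strong edges where the rail would be.
grad_x = np.concatenate([np.random.default_rng(0).normal(0.0, 0.5, 500),
                         np.random.default_rng(1).normal(30.0, 1.0, 50)])
labels, centers = kmeans_1d(grad_x)
rail_cluster = int(np.argmax(centers))  # high-gradient cluster ~ rail region
```

In the patent this clustering is applied separately to the horizontal and vertical gradient sets; running the same routine on each set yields the rail regions and the fastener/sleeper regions respectively.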

Further, step 4) specifically: for a virtual feature point $(k_x, k_y)$, the optical flow equations of the images at different scales are obtained from the short-time brightness constancy assumption and the spatial consistency assumption. A neighborhood window of size $m \times m$ is selected; by the principle of consistent pixel motion within the neighborhood, the $m \times m$ pixels in the neighborhood of the virtual feature point satisfy

$$\begin{bmatrix} I_x(k_x - m, k_y - m) & I_y(k_x - m, k_y - m) \\ \vdots & \vdots \\ I_x(k_x + m, k_y + m) & I_y(k_x + m, k_y + m) \end{bmatrix} \begin{bmatrix} u \\ v \end{bmatrix} = -\begin{bmatrix} I_t(k_x - m, k_y - m) \\ \vdots \\ I_t(k_x + m, k_y + m) \end{bmatrix}$$

where $u, v$ are the optical flow of the virtual feature point $(k_x, k_y)$ in the horizontal and vertical directions; $(k_x - m, k_y - m), \ldots, (k_x + m, k_y + m)$ are the pixel coordinates within the neighborhood of $(k_x, k_y)$; $I_x(\cdot)$ and $I_y(\cdot)$ are the gradients of the pixel grayscale in the $x$ and $y$ directions; and $I_t(\cdot)$ is the derivative of the pixel grayscale with respect to time $t$.

Since the above system of equations is overdetermined, the present invention solves it by the least squares method, yielding the optical flow of the virtual feature point at any scale.
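The least-squares solution of the stacked brightness-constancy equations can be illustrated as follows. This is a minimal sketch on synthetic gradients consistent with a known shift; in practice the gradients come from the image pyramid:

```python
import numpy as np

def lk_flow(Ix: np.ndarray, Iy: np.ndarray, It: np.ndarray):
    """Solve the stacked brightness-constancy equations
    Ix*u + Iy*v = -It for one neighborhood window by least squares."""
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)   # shape (m*m, 2)
    b = -It.ravel()                                  # shape (m*m,)
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic 5x5 neighborhood whose temporal derivative is exactly
# consistent with a flow of (u, v) = (0.5, -0.25) pixels/frame.
rng = np.random.default_rng(0)
Ix = rng.normal(size=(5, 5))
Iy = rng.normal(size=(5, 5))
It = -(Ix * 0.5 + Iy * (-0.25))
u, v = lk_flow(Ix, Iy, It)  # recovers (0.5, -0.25)
```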

Further, step 5) specifically: fuse the optical flow of the virtual feature points at the different scales according to the scale relationship between the images to obtain the virtual feature point optical flow, and use it to compute the vibration of the rail and fasteners in the pixel coordinate system.

Let the optical flow of the virtual feature point over the frames, in the horizontal and vertical directions, be $\{u_k, v_k \mid k = 1, 2, 3, \ldots, K\}$, in pixels/frame, where $K$ is the total number of video frames. The motion of the structure is obtained from the optical flow of the virtual feature point:

$$m_x(k) = \sum_{i=1}^{k} u_i, \qquad m_y(k) = \sum_{i=1}^{k} v_i$$

where $m_x$, $m_y$ are the motion of the virtual feature point in the horizontal and vertical directions (in pixels) and $f$ is the recording frame rate, which maps the frame index $k$ to time $t = k/f$.

In the present invention, the camera is calibrated with a grid calibration plate: the scale factor is obtained from the relationship between the physical grid size and its size in the pixel coordinate system, and the time-domain vibration of the rail and fasteners in the physical coordinate system is computed.

Let the calibration grayscale image be $I(x, y)$. Its grayscale gradients in the horizontal and vertical directions are

$$I_x = H_x \otimes I, \qquad I_y = H_y \otimes I$$

where $\otimes$ denotes the convolution operation and $H_x$, $H_y$ are the gradient operators in the $x$ and $y$ directions, respectively.

The gradient magnitude is $G = \sqrt{I_x^2 + I_y^2}$.

By comparing gradient magnitudes, the size of the grid in the pixel coordinate system is determined, denoted $J$ (in pixels); the actual grid size is $R$ (in mm). The scale factor is then $SF = R / J$.

The time-domain vibration of the track is

$$S_x = SF \cdot m_x, \qquad S_y = SF \cdot m_y$$

where $S_x$, $S_y$ are the time-domain vibration of the rail and fasteners in the horizontal and vertical directions in the physical coordinate system.

Further, step 6) specifically: perform FFT decomposition of the track's time-domain vibration signal to obtain the track's natural vibration frequency.

Whether the track has loosened is then judged in real time from changes in the natural vibration frequency.
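The frequency-domain step can be sketched as a peak-pick on the FFT magnitude spectrum of the displacement signal. The 35 Hz signal and 1000 frames/s rate below are illustrative values, not from the patent:

```python
import numpy as np

def natural_frequency(signal: np.ndarray, fps: float) -> float:
    """Estimate the dominant vibration frequency of a time-domain
    displacement signal from the peak of its FFT magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))  # remove DC offset
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return float(freqs[np.argmax(spectrum)])

# Synthetic track vibration: a 35 Hz sine sampled at 1000 frames/s for 1 s.
fps = 1000.0
t = np.arange(0, 1.0, 1.0 / fps)
x = np.sin(2 * np.pi * 35.0 * t)
f0 = natural_frequency(x, fps)  # ~35 Hz
```

A drop in the estimated natural frequency between inspections would then flag possible loosening, as the text describes.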

Compared with existing track looseness detection methods, the beneficial effects of the present invention are:

1) High measurement efficiency: all tracks within the field of view can be monitored simultaneously;

2) No markers required: virtual feature points replace manual markers, giving a wider range of applicability.

Brief description of the drawings

Figure 1 is a schematic diagram of the detection device of the rail fastener monitoring method of the present invention.

Figure 2 is the monitoring flow chart for rail fasteners in the present invention.

Figure 3 is a schematic diagram of the principle of rail and fastener region identification based on pixel gradients and the clustering algorithm.

Figure 4 is a schematic diagram of virtual feature point detection.

Figure 5 is a schematic diagram of pixel position changes and grayscale matrix changes over a period of time.

Figure 6 is a schematic diagram of the scale factor calculation process.

Figure 7 is a schematic diagram of the fastener time-domain vibration calculation process.

Detailed description of the embodiments

The embodiments of the present invention are described in detail below with reference to the drawings and examples.

As shown in Figure 1, the camera-based non-contact real-time detection method for train track looseness of the present invention uses a high-speed camera to record video of the track vibrating under a passing train or under artificial excitation. Exploiting the staggered arrangement of the rails and sleepers, a virtual-feature-point detection method based on image pixel grayscale gradients and a feature clustering algorithm is proposed; an optical flow algorithm computes the optical flow of the virtual feature points, from which the time-domain vibration of the track is measured; the natural vibration frequency of the track is obtained by FFT decomposition, and changes in this frequency are used to judge in real time whether the track has loosened. The invention is further described below with reference to the drawings.

Step 1: As shown in Figure 2, use a high-speed camera to record video of the vibration of the track fasteners.

Step 2: To improve the efficiency of virtual feature point screening, convert the color images captured by the camera to grayscale frame by frame:

I(x,y) = 0.299·R(x,y) + 0.587·G(x,y) + 0.114·B(x,y) (1)

where I(x, y) is the grayscale value of pixel (x, y), and R(x, y), G(x, y), B(x, y) are the values of the three color channels of pixel (x, y).

Step 3: As shown in Figure 3, let the set of horizontal grayscale gradients computed from the pixels of the first frame be $G_x = \{G_x^1, G_x^2, \ldots, G_x^l\}$. The K-means clustering algorithm is applied to cluster the horizontal gradients:

$$E_x = \sum_{v=1}^{2} \sum_{G_x^i \in C_v} \left(G_x^i - \bar{G}_x^v\right)^2$$

where $E_x$ is the within-cluster sum of squares in the horizontal direction; $v$ indexes the clusters (two clusters are used in the invention, with $v = 1$ denoting the rail region and $v = 2$ the other regions); $\bar{G}_x^v$ is the mean horizontal gradient of cluster $v$; and $l$ is the number of image pixels.

Let the set of vertical grayscale gradients computed from the pixels of the first frame be $G_y = \{G_y^1, G_y^2, \ldots, G_y^l\}$. The K-means clustering algorithm is applied to cluster the vertical gradients:

$$E_y = \sum_{h=1}^{2} \sum_{G_y^i \in C_h} \left(G_y^i - \bar{G}_y^h\right)^2$$

where $E_y$ is the within-cluster sum of squares in the vertical direction; $h$ indexes the clusters (two clusters are used in the invention, with $h = 1$ denoting the fastener and sleeper regions and $h = 2$ the other regions); $\bar{G}_y^h$ is the mean vertical gradient of cluster $h$; and $l$ is the number of image pixels.

In the present invention, as shown in Figure 4, the detected rail and fastener regions are denoted $\Omega = (\Omega_1, \Omega_2, \ldots, \Omega_q)$, where $q$ is the number of rail and fastener regions determined. The pixel grayscale extremum point of each region is taken as that region's virtual feature point. For region $\Omega_1$, the grayscale extremum is

$$(k_x, k_y) = \arg\max_{(x, y) \in \Omega_1} I(x, y)$$

where $(k_x, k_y)$ is the pixel coordinate of the point with the largest grayscale value in $\Omega_1$, taken in the present invention as the virtual feature point of $\Omega_1$; $I(k_x, k_y)$ is the grayscale value of the virtual feature point; $I(x, y)$ is the grayscale value at pixel $(x, y)$; and $n_2$ is the number of pixels in $\Omega_1$.

The grayscale extremum points of the rail and fastener regions at different positions are detected, yielding virtual feature points at different positions.

Step 4: As shown in Figure 5, for a virtual feature point $(k_x, k_y)$, the optical flow equations of the images at different scales are obtained from the short-time brightness constancy assumption and the spatial consistency assumption. A neighborhood window of size $m \times m$ is selected; by the principle of consistent pixel motion within the neighborhood, the $m \times m$ pixels in the neighborhood of the virtual feature point satisfy

$$\begin{bmatrix} I_x(k_x - m, k_y - m) & I_y(k_x - m, k_y - m) \\ \vdots & \vdots \\ I_x(k_x + m, k_y + m) & I_y(k_x + m, k_y + m) \end{bmatrix} \begin{bmatrix} u \\ v \end{bmatrix} = -\begin{bmatrix} I_t(k_x - m, k_y - m) \\ \vdots \\ I_t(k_x + m, k_y + m) \end{bmatrix}$$

where $u, v$ are the optical flow of the virtual feature point $(k_x, k_y)$ in the horizontal and vertical directions; $(k_x - m, k_y - m), \ldots, (k_x + m, k_y + m)$ are the pixel coordinates within the neighborhood of $(k_x, k_y)$; $I_x(\cdot)$ and $I_y(\cdot)$ are the gradients of the pixel grayscale in the $x$ and $y$ directions; and $I_t(\cdot)$ is the derivative of the pixel grayscale with respect to time $t$.

Since the above system of equations is overdetermined, the present invention solves it by the least squares method, yielding the optical flow of the virtual feature point at any scale.

Step 5: Fuse the optical flow of the virtual feature points at the different scales according to the scale relationship between the images to obtain the virtual feature point optical flow, and use it to compute the vibration of the rail and fasteners in the pixel coordinate system.

Let the optical flow of the virtual feature point over the frames, in the horizontal and vertical directions, be $\{u_k, v_k \mid k = 1, 2, 3, \ldots, K\}$, in pixels/frame, where $K$ is the total number of video frames. The motion of the structure is obtained from the optical flow of the virtual feature point:

$$m_x(k) = \sum_{i=1}^{k} u_i, \qquad m_y(k) = \sum_{i=1}^{k} v_i$$

where $m_x$, $m_y$ are the motion of the virtual feature point in the horizontal and vertical directions (in pixels) and $f$ is the recording frame rate, which maps the frame index $k$ to time $t = k/f$.

In the present invention, the camera is calibrated with a grid calibration plate: the scale factor is obtained from the relationship between the physical grid size and its size in the pixel coordinate system, and the time-domain vibration of the rail and fasteners in the physical coordinate system is computed.

Referring to Figures 6 and 7, let the calibration grayscale image be $I(x, y)$. Its grayscale gradients in the horizontal and vertical directions are

$$I_x = H_x \otimes I, \qquad I_y = H_y \otimes I$$

where $\otimes$ denotes the convolution operation and $H_x$, $H_y$ are the gradient operators in the $x$ and $y$ directions, respectively.

The gradient magnitude is $G = \sqrt{I_x^2 + I_y^2}$.

By comparing gradient magnitudes, the size of the grid in the pixel coordinate system is determined, denoted $J$ (in pixels); the actual grid size is $R$ (in mm). The scale factor is then $SF = R / J$.

The time-domain vibration of the track is

$$S_x = SF \cdot m_x, \qquad S_y = SF \cdot m_y$$

where $S_x$, $S_y$ are the time-domain vibration of the rail and fasteners in the horizontal and vertical directions in the physical coordinate system.

Step 6: Perform FFT decomposition of the track's time-domain vibration signal to obtain the track's natural vibration frequency.

Whether the track has loosened is then judged in real time from changes in the natural vibration frequency.

Claims (5)

1.一种基于摄像的火车轨道松动非接触实时检测方法,其特征在于,包括以下步骤:1. A camera-based non-contact real-time detection method of train track looseness, which is characterized by including the following steps: 步骤1),利用高速摄像机对火车轨道的振动进行视频记录;Step 1), use a high-speed camera to video record the vibration of the train track; 步骤2),对记录的视频,逐帧对图像进行灰度化处理;Step 2), perform grayscale processing on the recorded video frame by frame; 步骤3),计算第一帧图像像素的灰度梯度,采用K-means聚类算法对灰度梯度进行聚类,确定轨道和扣件区域,将区域内像素灰度极值作为该区域的虚拟特征点;Step 3), calculate the gray gradient of the pixels in the first frame of the image, use the K-means clustering algorithm to cluster the gray gradient, determine the track and fastener areas, and use the pixel gray extreme values in the area as the virtual value of the area. Feature points; 步骤4),通过图像多尺度分解技术构建图像多尺度金字塔,基于短时亮度恒定理论建立不同尺度上图像的光流方程,采用最小二乘算法计算不同尺度图像上虚拟特征点的光流;Step 4), build an image multi-scale pyramid through image multi-scale decomposition technology, establish optical flow equations for images at different scales based on the short-term brightness constant theory, and use the least squares algorithm to calculate the optical flow of virtual feature points on images at different scales; 步骤5),利用金字塔图像间的尺度关系,对不同尺度图像上虚拟特征点的光流进行融合,得到虚拟特征点光流的计算结果,结合图像标定技术,获取轨道和扣件时域振动信号;Step 5), use the scale relationship between pyramid images to fuse the optical flow of virtual feature points on images of different scales to obtain the calculation results of the optical flow of virtual feature points. Combined with image calibration technology, obtain the time domain vibration signals of the track and fasteners. 
Step 6): perform frequency-domain analysis on the time-domain vibration signals of the track and fasteners to obtain the natural frequency of the track, and detect track looseness from the change characteristics of the natural frequency;
wherein, in step 3), the set of horizontal gray-level gradients computed for the pixels of the first frame is denoted G_x = {g_1, g_2, ..., g_l}, and the K-means clustering algorithm analyses the horizontal gradients by minimizing
E_x = Σ_{v=1}^{2} Σ_{g ∈ G_v} (g − ḡ_v)²
where E_x is the within-class sum of squares in the horizontal direction, whose minimum gives the best clustering; v is the cluster label, v = 1 denoting the track region and v = 2 the other regions; ḡ_v is the mean horizontal gradient of cluster v; l is the number of image pixels;
likewise, the set of vertical gray-level gradients computed for the pixels of the first frame is denoted G_y = {g_1, g_2, ..., g_l}, and the K-means clustering algorithm analyses the vertical gradients by minimizing
E_y = Σ_{h=1}^{2} Σ_{g ∈ G_h} (g − ḡ_h)²
where E_y is the within-class sum of squares in the vertical direction, whose minimum gives the best clustering; h is the cluster label, h = 1 denoting the fastener and sleeper regions and h = 2 the other regions; ḡ_h is the mean vertical gradient of cluster h; l is the number of image pixels;
the detected track and fastener regions are recorded as Ω = (Ω_1, Ω_2, ..., Ω_q), q being the number of track and fastener regions determined, and the gray-level extremum point of each region is used as that region's virtual feature point; for region Ω_1, the gray-level extremum within the region is expressed as
(k_x, k_y) = arg max_{(x,y) ∈ Ω_1} I(x, y)
where (k_x, k_y) is the pixel coordinate of the point with the largest gray value in region Ω_1, which is taken as the virtual feature point of Ω_1; I(k_x, k_y) is the gray value of the virtual feature point; I(x, y) is the gray value at pixel position (x, y); n² is the number of pixels in region Ω_1;
the gray-level extremum points within the track and fastener regions at the different positions are detected in the same way, yielding the virtual feature points at the different positions.
2. The camera-based non-contact real-time detection method for train track looseness according to claim 1, characterized in that, in step 4), for a virtual feature point (k_x, k_y), the optical flow equations of the images at the different scales are obtained from the short-time brightness-constancy theory and the spatial-consistency assumption; a neighborhood window of size m × m is selected around the virtual feature point, and, by the principle of consistent motion of the pixels within the neighborhood, the m × m neighborhood pixels satisfy the overdetermined system
I_x(k_x − m, k_y − m) u + I_y(k_x − m, k_y − m) v = −I_t(k_x − m, k_y − m)
⋮
I_x(k_x + m, k_y + m) u + I_y(k_x + m, k_y + m) v = −I_t(k_x + m, k_y + m)
where u, v are the optical flow of the virtual feature point (k_x, k_y) in the horizontal and vertical directions; (k_x − m, k_y − m), ..., (k_x + m, k_y + m) are the pixel coordinates within the neighborhood of the virtual feature point; I_x(k_x + m, k_y + m) and I_y(k_x + m, k_y + m) denote the gradients of the neighborhood pixels' gray levels in the x and y directions; I_t(k_x + m, k_y + m) denotes the derivative of the neighborhood pixels' gray levels with respect to time t;
the system is solved by the least-squares method, giving the optical flow of the virtual feature points at the different scales.
3. The camera-based non-contact real-time detection method for train track looseness according to claim 1, characterized in that, in step 5), the optical flow of the virtual feature points over the frames is denoted {u_k, v_k | k = 1, 2, 3, ..., K} in the horizontal and vertical directions, in pixels/frame, K being the total number of frames of the video; the motion of the structure is obtained from the optical flow of the virtual feature points as
m_x(k) = Σ_{i=1}^{k} u_i,  m_y(k) = Σ_{i=1}^{k} v_i,  sampled at the instants t_k = k / f
where m_x, m_y are the motions of the virtual feature point in the horizontal and vertical directions, and f is the shooting frame rate;
the camera is calibrated with a grid calibration board, the scale factor is obtained from the relation between the grid size and its size in the pixel coordinate system, and the time-domain vibration of the track and fasteners in the physical coordinate system is computed.
4. The camera-based non-contact real-time detection method for train track looseness according to claim 3, characterized in that, when the camera is calibrated, the calibration grayscale image is I(x, y), and its gray-level gradients in the horizontal and vertical directions are
I_x = H_x ⊗ I(x, y),  I_y = H_y ⊗ I(x, y)
where ⊗ is the convolution operation, and H_x, H_y are the gradient operators in the x and y directions;
the gradient magnitude is
G = sqrt(I_x² + I_y²);
by comparing gradient magnitudes, the size of a grid cell in the pixel coordinate system is determined, denoted J, in pixels; the actual grid-cell size is R, in millimetres (mm), so the scale factor SF is
SF = R / J
and the time-domain vibration of the track is
S_x = SF · m_x,  S_y = SF · m_y
where S_x, S_y are the time-domain vibrations of the track and fasteners in the horizontal and vertical directions in the physical coordinate system.
5. The camera-based non-contact real-time detection method for train track looseness according to claim 1, characterized in that, in step 6), the time-domain vibration signal of the track is decomposed by FFT to obtain the natural vibration frequency of the track, and whether the track has loosened is judged in real time from the change of the natural vibration frequency.
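Step 3) of claim 1 — two-class K-means clustering of the first frame's gray-level gradients to separate high-gradient track/fastener regions from the background — can be sketched as below. This is a minimal illustration, not the patented implementation: the toy frame, the central-difference gradient, and the `kmeans_1d` helper are all assumptions of this sketch.

```python
def kmeans_1d(values, iters=50):
    """Two-class 1-D K-means: split gradient values into low/high clusters
    by minimizing the within-class sum of squares (the E_x criterion)."""
    c_lo, c_hi = min(values), max(values)          # initial centroids
    for _ in range(iters):
        lo = [v for v in values if abs(v - c_lo) <= abs(v - c_hi)]
        hi = [v for v in values if abs(v - c_lo) > abs(v - c_hi)]
        c_lo = sum(lo) / len(lo) if lo else c_lo   # update centroids to
        c_hi = sum(hi) / len(hi) if hi else c_hi   # the cluster means
    return c_lo, c_hi

def horizontal_gradients(gray):
    """Central-difference gray-level gradient magnitude, horizontal direction."""
    return [abs(row[x + 1] - row[x - 1]) / 2.0
            for row in gray for x in range(1, len(row) - 1)]

# Toy 'first frame': a dark background with one bright vertical edge (the rail).
gray = [[10, 10, 10, 200, 200, 200] for _ in range(4)]
grads = horizontal_gradients(gray)
c_lo, c_hi = kmeans_1d(grads)
# Pixels whose gradient is nearer c_hi belong to the track-edge cluster (v = 1).
edge = [g for g in grads if abs(g - c_hi) < abs(g - c_lo)]
print(c_lo, c_hi, len(edge))
```

On this toy frame the two centroids converge to the background and edge gradient levels, and the high-gradient cluster contains exactly the pixels flanking the bright edge.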
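The pipeline of claims 2–5 — least-squares optical flow, pixel-to-physical scaling via SF = R/J, and FFT peak picking — can be exercised end to end on synthetic data. Everything numeric here is an assumption of the sketch, not a value from the patent: the random gradients, the planted flow (0.5, −0.25), the 40 Hz test vibration, the 500 fps frame rate, and the 25 mm / 100 px grid.

```python
import numpy as np

# -- Claim 2: least-squares solve of the overdetermined Lucas-Kanade-style
#    system  Ix*u + Iy*v = -It  over a feature point's neighborhood window.
rng = np.random.default_rng(0)
n = 9                                     # pixels in a 3x3 neighborhood
Ix = rng.normal(size=n)                   # gray-level gradients in x
Iy = rng.normal(size=n)                   # gray-level gradients in y
u_true, v_true = 0.5, -0.25               # flow planted in the synthetic data
It = -(Ix * u_true + Iy * v_true)         # short-time brightness constancy
A = np.column_stack([Ix, Iy])
(u, v), *_ = np.linalg.lstsq(A, -It, rcond=None)

# -- Claim 3: a pixel-domain motion history sampled at t_k = k/f
#    (here a synthetic sine stands in for the accumulated flow m_x).
f = 500.0                                 # shooting frame rate, frames/s
t = np.arange(500) / f                    # 1 s of video
m_x = 2.0 * np.sin(2 * np.pi * 40.0 * t)  # 40 Hz vibration, in pixels

# -- Claim 4: scale factor from the grid calibration board, SF = R / J.
R, J = 25.0, 100.0                        # grid cell: 25 mm real, 100 px imaged
SF = R / J                                # mm per pixel
S_x = SF * m_x                            # physical displacement, mm

# -- Claim 5: FFT of the time-domain signal -> natural-frequency peak.
spec = np.abs(np.fft.rfft(S_x))
freqs = np.fft.rfftfreq(len(S_x), d=1.0 / f)
f_nat = freqs[np.argmax(spec)]

print(round(u, 6), round(v, 6), SF, f_nat)
```

Run as-is, the solve recovers the planted flow (0.5, −0.25) and the spectrum peaks at the planted 40 Hz; in the actual method the gradients come from the pyramid images and m_x from the fused optical flow of the virtual feature points.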
CN202111163756.4A 2021-09-30 2021-09-30 A camera-based non-contact real-time detection method for train track looseness Active CN113947116B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111163756.4A CN113947116B (en) 2021-09-30 2021-09-30 A camera-based non-contact real-time detection method for train track looseness


Publications (2)

Publication Number Publication Date
CN113947116A CN113947116A (en) 2022-01-18
CN113947116B CN113947116B (en) 2023-10-31

Family

ID=79329769

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111163756.4A Active CN113947116B (en) 2021-09-30 2021-09-30 A camera-based non-contact real-time detection method for train track looseness

Country Status (1)

Country Link
CN (1) CN113947116B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106845364A (en) * 2016-12-28 2017-06-13 中国航天电子技术研究院 A kind of fast automatic object detection method
CN111532295A (en) * 2019-12-28 2020-08-14 昆山高新轨道交通智能装备有限公司 Rail transit removes intelligent operation and maintenance detecting system
CN112381860A (en) * 2020-11-21 2021-02-19 西安交通大学 Unmarked computer vision method for measuring dynamic frequency of rotating blade
CN112763904A (en) * 2020-12-29 2021-05-07 广州航天海特系统工程有限公司 Circuit breaker detection method, device, equipment and storage medium
WO2021163928A1 (en) * 2020-02-19 2021-08-26 华为技术有限公司 Optical flow obtaining method and apparatus


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
储林臻; 闫钧华; 杭谊青; 许俊峰. Real-time detection of moving ground targets against a rotating-motion background based on an improved optical flow method. 数据采集与处理 (Journal of Data Acquisition and Processing), 2015, (No. 06), full text. *
李鹏程; 郑树彬; 彭乐乐; 李立明. Research on the regular distribution of feature points in track images. 计算机测量与控制 (Computer Measurement & Control), 2019, (No. 04), full text. *

Also Published As

Publication number Publication date
CN113947116A (en) 2022-01-18

Similar Documents

Publication Publication Date Title
Lu et al. Vision-based structural displacement measurement under ambient-light changes via deep learning and digital image processing
CN109029283B (en) Track fastener bolt floating detection method based on height comparison
Shi et al. Deep learning based virtual point tracking for real-time target-less dynamic displacement measurement in railway applications
CN110979399B (en) Dynamic detection method for high-speed railway track condition
CN107678036A (en) A kind of vehicle-mounted contactless contact net geometric parameter dynamic detection system and method
CN110567680B (en) Track fastener looseness detection method based on angle comparison
CN111692985B (en) Constant-load deflection analysis method for single-span simply-supported girder bridge under traffic passing condition
Pan et al. On-site reliable wheel size measurement based on multisensor data fusion
CN102636364B (en) Vehicular safety monitoring system for shapes and structures of bridge floors and detection method
CN111003018A (en) A system and method for dynamic detection of high-speed railway track conditions
CN108797241B (en) Track fastener nut looseness detection method based on height comparison
CN112986069B (en) A Ballast Particle Deterioration Index Analyzer
CN110490163B (en) Intelligent processing method and device for railway video data
CN109060290A (en) The method that wind-tunnel density field is measured based on video and Sub-pixel Technique
Lydon et al. Development and testing of a composite system for bridge health monitoring utilising computer vision and deep learning
Qiu et al. Rail fastener positioning based on double template matching
CN104916078B (en) Intermittent rainfall induces the Detection of Stability method of accumulation type slope model
CN113947116B (en) A camera-based non-contact real-time detection method for train track looseness
Li et al. A visual inspection system for rail corrugation based on local frequency features
CN106482648A (en) Based on the absolute monitoring device of thin tail sheep in the long-distance plane of fixed point and method
CN115761487A (en) A fast identification method for vibration characteristics of small and medium-span bridges based on machine vision
CN206049709U (en) A kind of railroad survey gauge measures detection means
CN106644034A (en) Non-contact high-speed rail road foundation vibration measurement system
CN113610786A (en) Track deformation monitoring method based on visual measurement
CN107491786A (en) A kind of tobacco purchase repeats weigh behavior Automatic Visual Inspection and recognition methods

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant