CN113947116B - Train track looseness non-contact real-time detection method based on camera shooting - Google Patents


Info

Publication number
CN113947116B
CN113947116B CN202111163756.4A
Authority
CN
China
Prior art keywords
track
gray
virtual feature
pixel
gradient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111163756.4A
Other languages
Chinese (zh)
Other versions
CN113947116A (en)
Inventor
徐自力 (Xu Zili)
辛存 (Xin Cun)
王存俊 (Wang Cunjun)
李康迪 (Li Kangdi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN202111163756.4A priority Critical patent/CN113947116B/en
Publication of CN113947116A publication Critical patent/CN113947116A/en
Application granted granted Critical
Publication of CN113947116B publication Critical patent/CN113947116B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12Classification; Matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256Lane; Road marking
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Quality & Reliability (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

To address the low efficiency of current train track looseness inspection, which relies mainly on manual patrols, the invention discloses a camera-based non-contact real-time detection method for train track looseness. The method uses a high-speed camera to capture a vibration video of the track under a passing train or artificial excitation. Exploiting the staggered arrangement of rails and sleepers, it proposes a virtual feature point detection method based on image pixel gray gradients and a feature clustering algorithm, and applies an optical flow algorithm to the virtual feature points to measure the time-domain vibration of the track. The natural vibration frequency of the track is then obtained by FFT (fast Fourier transform) decomposition, and looseness is judged in real time from changes in that frequency. The measuring device is simple, accurate, low-cost and easy to operate.

Description

Train track looseness non-contact real-time detection method based on camera shooting
Technical Field
The invention belongs to the technical field of structural motion measurement, and particularly relates to a non-contact real-time detection method for train track looseness based on camera shooting.
Background
The train track system is an important part of the rail transportation system and consists mainly of rails, sleepers, fasteners and the roadbed. The rails are fixed to the sleepers by fasteners. During train operation, the periodic impact loads transmitted through the rails easily cause the fasteners to vibrate and loosen. As the loosening worsens, the dynamic response amplitude of the track structure grows markedly and, in severe cases, can cause a derailment accident. Detecting train track looseness is therefore of great significance for ensuring the running safety of trains.
At present, train track looseness is detected mainly by manual inspection: experienced inspectors usually judge the state of each fastener by eye. This approach is simple to carry out, but it is inefficient and costly, misses many faults, and poses large safety hazards. In recent years, with the development of image processing technology, computer vision has been increasingly applied to structural health monitoring of train tracks. Existing computer vision detection methods require attaching markers to the structure and measure track vibration by tracking the marked points. However, because train tracks are numerous and sit outdoors for long periods, the markers easily fall off, which poses a great challenge for measurement.
Disclosure of Invention
In order to overcome the defects of the prior art and solve the low efficiency and low precision of manual-inspection-based track looseness detection, the invention aims to provide a camera-based non-contact real-time detection method for train track looseness. Exploiting the staggered arrangement of rails and sleepers, it introduces a virtual feature point detection method based on image pixel gray gradients and a feature clustering algorithm.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows:
a non-contact real-time detection method for train track looseness based on camera shooting comprises the following steps:
step 1), video recording is carried out on vibration of a train track by utilizing a high-speed camera;
step 2), converting the recorded video to grayscale frame by frame;
step 3), calculating the gray gradient of the pixels of the first frame image, clustering the gray gradient by adopting a K-means clustering algorithm, determining a track and a fastener area, and taking the gray extreme value of the pixels in the area as a virtual characteristic point of the area;
step 4), constructing an image multi-scale pyramid by an image multi-scale decomposition technology, establishing an optical flow equation of images on different scales based on a short-time brightness constant theory, and calculating optical flows of virtual feature points on the images on different scales by adopting a least square algorithm;
step 5), fusing optical flows of virtual feature points on different scale images by utilizing the scale relation among pyramid images to obtain a calculation result of the optical flows of the virtual feature points, and acquiring time domain vibration signals of the track and the fastener based on an image calibration technology;
and 6) carrying out frequency domain analysis on the time domain vibration signals of the track and the fastener to obtain the natural frequency of the track, and detecting the loosening of the track through the characteristic of the change of the natural frequency.
Further, in step 2), to improve the screening efficiency of the virtual feature points, the color image acquired by the camera is converted to grayscale frame by frame to obtain a gray image:
I(x,y) = 0.299*R(x,y) + 0.587*G(x,y) + 0.114*B(x,y)
wherein: i (x, y) is the gray value of the pixel (x, y); r (x, y), G (x, y), B (x, y) are the pixel values of the three channels of the pixel (x, y), respectively.
Further, the step 3 specifically includes screening the virtual feature points based on the gray gradient of the image pixels and a feature clustering algorithm.
In the present invention, let the set of horizontal gradients computed from the pixels of the first frame image be G_x = {g_x^1, g_x^2, \ldots, g_x^l}. A K-means clustering algorithm is adopted to cluster-analyze the horizontal gradients:

E_x = \sum_{v=1}^{2} \sum_{g_x^i \in C_v} ( g_x^i - \bar{g}_x^v )^2

wherein: E_x is the within-class sum of squares in the horizontal direction, and the best clustering is achieved when its value is minimal; v is the cluster index, here taking 2 values: v = 1 represents the track region, v = 2 represents the other regions; \bar{g}_x^v is the mean horizontal gradient of cluster C_v; l is the number of image pixels.
Let the set of vertical gradients computed from the pixels of the first frame image be G_y = {g_y^1, g_y^2, \ldots, g_y^l}. The K-means clustering algorithm is likewise adopted to cluster-analyze the vertical gradients:

E_y = \sum_{h=1}^{2} \sum_{g_y^i \in C_h} ( g_y^i - \bar{g}_y^h )^2

wherein: E_y is the within-class sum of squares in the vertical direction, minimal at the best clustering; h is the cluster index, here taking 2 values: h = 1 represents the fastener and sleeper region, h = 2 represents the other regions; \bar{g}_y^h is the mean vertical gradient of cluster C_h; l is the number of image pixels.
In the present invention, the detected track and fastener regions are denoted Ω = (Ω_1, Ω_2, \ldots, Ω_q), q being the number of identified track and fastener regions. The pixel gray extreme point of each region is used as the virtual feature point of that region. For region Ω_1, the pixel gray extremum in the region is expressed as:

I(x_1^*, y_1^*) = \max_{i = 1, \ldots, n_1} I(x_i, y_i), \quad (x_i, y_i) \in \Omega_1

wherein: (x_1^*, y_1^*) are the pixel coordinates of the point with the largest gray value in Ω_1, and in the present invention this point is defined as the virtual feature point of region Ω_1; I(x_1^*, y_1^*) is the gray value of the virtual feature point; I(x, y) is the gray value at pixel (x, y); n_1 is the number of pixels in region Ω_1.
Gray extreme points are detected in the track and fastener regions at different positions, yielding virtual feature points at the different positions.
Further, step 4) specifically comprises: for the virtual feature point (k_x, k_y), the optical flow equations of the images at different scales are obtained based on the short-time brightness-constancy theory and the spatial consistency assumption. A neighborhood window of size m × m is selected around the virtual feature point; according to the principle of motion consistency of pixels within the neighborhood, the m × m pixels in the neighborhood of the virtual feature point satisfy:

\begin{bmatrix} I_x(k_x-m, k_y-m) & I_y(k_x-m, k_y-m) \\ \vdots & \vdots \\ I_x(k_x+m, k_y+m) & I_y(k_x+m, k_y+m) \end{bmatrix} \begin{bmatrix} u \\ v \end{bmatrix} = - \begin{bmatrix} I_t(k_x-m, k_y-m) \\ \vdots \\ I_t(k_x+m, k_y+m) \end{bmatrix}

wherein: u, v are the optical flows of the virtual feature point (k_x, k_y) in the horizontal and vertical directions; (k_x-m, k_y-m), \ldots, (k_x+m, k_y+m) are the pixel coordinates within the neighborhood of the virtual feature point; I_x(\cdot) and I_y(\cdot) are the gradients of the pixel gray values in the neighborhood in the x and y directions, respectively; I_t(\cdot) is the derivative of the pixel gray values in the neighborhood with respect to time t.
Considering that this is an over-determined system of equations, the invention solves it by the least squares method, so that the optical flow of the virtual feature point at any scale can be obtained.
Further, in the step 5, the optical flow information of the virtual feature points under different scales is fused according to the scale relation between the images, so as to obtain the optical flow of the virtual feature points. And calculating the vibration of the track and the fastener under the pixel coordinate system by using the optical flow of the virtual characteristic points.
Let the optical flow of the virtual feature point in different frames be {u_k, v_k | k = 1, 2, 3, \ldots, K} in the horizontal and vertical directions, in pixels/frame, K being the total number of frames of the video. The motion of the structure can then be obtained from the optical flow of the virtual feature point:

m_x(k) = \sum_{j=1}^{k} u_j, \quad m_y(k) = \sum_{j=1}^{k} v_j, \quad t_k = k / f

wherein: m_x, m_y are the motions of the virtual feature point in the horizontal and vertical directions, respectively, and f is the shooting frame rate.
In the invention, a grid calibration plate is adopted to calibrate the camera, a scale factor is obtained through the size relation between the grid size and the size of the grid under a pixel coordinate system, and the time domain vibration of the track and the fastener under a physical coordinate system is calculated.
Let the calibration gray image be I(x, y); the gray gradients of the image in the horizontal and vertical directions are respectively:

G_x = H_x * I(x, y), \quad G_y = H_y * I(x, y)

wherein: * denotes the convolution operation, and H_x, H_y are the gradient operators in the x and y directions, respectively.
The gradient amplitude is:

G = \sqrt{G_x^2 + G_y^2}

The size of the grid in the pixel coordinate system is determined by comparing the gradient amplitudes and is denoted J, in pixels; the actual size of the grid is R, in millimeters (mm). The scale factor SF is:

SF = R / J
the time domain vibration of the track is:
wherein: s is S x 、S y The time domain vibration of the rail and the fastener in the horizontal and vertical directions under the physical coordinate system is respectively.
Further, in step 6), FFT decomposition is performed on the time-domain vibration signal of the track to obtain the natural vibration frequency of the track:

X(n) = \sum_{k=0}^{K-1} S_k \, e^{-j 2\pi n k / K}, \quad n = 0, 1, \ldots, K-1

the natural vibration frequency being identified from the dominant peak of |X(n)|, at frequency f_n = n f / K.
and judging whether the track loosens or not in real time through the change of the natural vibration frequency.
Compared with the existing track looseness detection method, the method has the beneficial effects that:
1) The measuring efficiency is high, and all tracks in the visual field can be monitored at the same time;
2) Virtual feature points replace the manual markers used previously, so no marking is required, giving the method a wider application range.
Drawings
Fig. 1 is a schematic diagram of a detection device of a rail fastener monitoring method in the invention.
Fig. 2 is a flow chart of monitoring a rail fastener of the present invention.
Fig. 3 is a schematic diagram of a principle of identifying a track and a fastener region based on a pixel gradient and a clustering algorithm.
Fig. 4 is a schematic diagram of detection of virtual feature points.
FIG. 5 is a schematic diagram showing the pixel position change and gray matrix change over a period of time.
Fig. 6 is a schematic diagram of a scale factor calculation flow.
Fig. 7 is a schematic diagram of a calculation flow of time domain vibration of a fastener.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings and examples.
As shown in fig. 1, the invention provides a camera-based non-contact real-time detection method for train track looseness: a high-speed camera captures a vibration video of the track while a train passes or under artificial excitation, and the staggered arrangement of rails and sleepers is exploited to detect virtual feature points. The invention is further described below with reference to the accompanying drawings.
Step 1: as shown in fig. 2, vibration of the rail fastening is video recorded with a high-speed camera.
Step 2: in order to improve the screening efficiency of the virtual feature points, gray scale processing is carried out on the color images acquired by the camera frame by frame, and gray scale images are acquired:
I(x,y)=0.299*R(x,y)+0.587*G(x,y)+0.114*B(x,y) (1)
wherein: i (x, y) is the gray value of the pixel (x, y); r (x, y), G (x, y), B (x, y) are the pixel values of the three channels of the pixel (x, y), respectively.
Step 3: in the present invention, as shown in FIG. 3, the set of gradients in the horizontal direction calculated by the pixels of the first frame image is set to beCarrying out cluster analysis on the gradient in the horizontal direction by adopting a K-means clustering algorithm:
wherein: e (E) x Calculation junction for the sum of squares of the classifications in the horizontal directionIf v is the type of cluster, in the present invention v is 2, v=1 represents the track region, v=2 represents the other region,representing the average of the gradients in the horizontal direction, l being the number of image pixels.
Let the set of vertical gradients computed from the pixels of the first frame image be G_y = {g_y^1, g_y^2, \ldots, g_y^l}. The K-means clustering algorithm is likewise adopted to cluster-analyze the vertical gradients:

E_y = \sum_{h=1}^{2} \sum_{g_y^i \in C_h} ( g_y^i - \bar{g}_y^h )^2 (3)

wherein: E_y is the within-class sum of squares in the vertical direction, minimal at the best clustering; h is the cluster index, here taking 2 values: h = 1 represents the fastener and sleeper region, h = 2 represents the other regions; \bar{g}_y^h is the mean vertical gradient of cluster C_h; l is the number of image pixels.
In the present invention, as shown in fig. 4, the detected track and fastener regions are denoted Ω = (Ω_1, Ω_2, \ldots, Ω_q), q being the number of identified track and fastener regions; the pixel gray extreme point of each region is used as the virtual feature point of that region. For region Ω_1, the pixel gray extremum in the region is expressed as:

I(x_1^*, y_1^*) = \max_{i = 1, \ldots, n_1} I(x_i, y_i), \quad (x_i, y_i) \in \Omega_1 (4)

wherein: (x_1^*, y_1^*) are the pixel coordinates of the point with the largest gray value in Ω_1, and in the present invention this point is defined as the virtual feature point of region Ω_1; I(x_1^*, y_1^*) is the gray value of the virtual feature point; I(x, y) is the gray value at pixel (x, y); n_1 is the number of pixels in region Ω_1.
Gray extreme points are detected in the track and fastener regions at different positions, yielding virtual feature points at the different positions.
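The extremum search itself reduces to a masked arg-max; in the sketch below (the names `virtual_feature_point` and `region_mask` and the 3×3 gray patch are illustrative assumptions) the maximum-gray pixel of a region is taken as its virtual feature point:

```python
import numpy as np

def virtual_feature_point(gray, region_mask):
    """Coordinates and value of the gray extremum inside one region,
    used as that region's virtual feature point (equation (4))."""
    masked = np.where(region_mask, gray, -np.inf)  # exclude other regions
    idx = np.unravel_index(np.argmax(masked), gray.shape)
    return idx, gray[idx]

gray = np.array([[1.0, 5.0, 2.0],
                 [3.0, 9.0, 4.0],
                 [0.0, 2.0, 1.0]])
mask = np.ones_like(gray, dtype=bool)  # whole patch is one region here
pt, val = virtual_feature_point(gray, mask)
```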
Step 4: As shown in fig. 5, for the virtual feature point (k_x, k_y), the optical flow equations of the images at different scales are obtained based on the short-time brightness-constancy theory and the spatial consistency assumption. A neighborhood window of size m × m is selected around the virtual feature point; according to the principle of motion consistency of pixels within the neighborhood, the m × m pixels in the neighborhood of the virtual feature point satisfy:

\begin{bmatrix} I_x(k_x-m, k_y-m) & I_y(k_x-m, k_y-m) \\ \vdots & \vdots \\ I_x(k_x+m, k_y+m) & I_y(k_x+m, k_y+m) \end{bmatrix} \begin{bmatrix} u \\ v \end{bmatrix} = - \begin{bmatrix} I_t(k_x-m, k_y-m) \\ \vdots \\ I_t(k_x+m, k_y+m) \end{bmatrix} (5)

wherein: u, v are the optical flows of the virtual feature point (k_x, k_y) in the horizontal and vertical directions; (k_x-m, k_y-m), \ldots, (k_x+m, k_y+m) are the pixel coordinates within the neighborhood of the virtual feature point; I_x(\cdot) and I_y(\cdot) are the gradients of the pixel gray values in the neighborhood in the x and y directions, respectively; I_t(\cdot) is the derivative of the pixel gray values in the neighborhood with respect to time t.
Considering that this is an over-determined system of equations, the invention solves it by the least squares method, so that the optical flow of the virtual feature point at any scale can be obtained.
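The least-squares solution of the over-determined neighborhood system can be sketched as follows (illustrative only; the synthetic gradients are constructed to satisfy brightness constancy I_t = -(I_x u + I_y v) for a known flow (1, 0), and `lk_flow` is an assumed name):

```python
import numpy as np

def lk_flow(Ix, Iy, It):
    """Least-squares optical flow of one feature point: solve
    [Ix Iy] [u v]^T = -It over the neighborhood window."""
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic 5x5 neighborhood translating by exactly (u, v) = (1, 0)
rng = np.random.default_rng(0)
Ix = rng.standard_normal((5, 5))
Iy = rng.standard_normal((5, 5))
It = -(Ix * 1.0 + Iy * 0.0)
u, v = lk_flow(Ix, Iy, It)
```

In practice this solve is repeated at each level of the image pyramid and the results fused across scales, as described in step 5.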
Step 5: and according to the scale relation among the images, fusing the optical flow information of the virtual feature points under different scales to obtain the optical flow of the virtual feature points. And calculating the vibration of the track and the fastener under the pixel coordinate system by using the optical flow of the virtual characteristic points.
Let the optical flow of the virtual feature point in different frames be {u_k, v_k | k = 1, 2, 3, \ldots, K} in the horizontal and vertical directions, in pixels/frame, K being the total number of frames of the video. The motion of the structure can then be obtained from the optical flow of the virtual feature point:

m_x(k) = \sum_{j=1}^{k} u_j, \quad m_y(k) = \sum_{j=1}^{k} v_j, \quad t_k = k / f (6)

wherein: m_x, m_y are the motions of the virtual feature point in the horizontal and vertical directions, respectively, and f is the shooting frame rate.
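Under the (assumed) reading that the per-frame optical flow is accumulated into displacement, the pixel-domain vibration signal can be sketched as:

```python
import numpy as np

def displacement_from_flow(u_flow, v_flow, f):
    """Cumulative pixel displacement of a virtual feature point from its
    per-frame optical flow (pixel/frame); f is the frame rate in Hz."""
    t = np.arange(1, len(u_flow) + 1) / f  # time stamp of each frame
    m_x = np.cumsum(u_flow)
    m_y = np.cumsum(v_flow)
    return t, m_x, m_y

u = np.array([0.5, -0.5, 0.5, -0.5])  # oscillating flow, pixel/frame
v = np.zeros(4)
t, m_x, m_y = displacement_from_flow(u, v, f=1000.0)
```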
In the invention, a grid calibration plate is adopted to calibrate the camera, a scale factor is obtained through the size relation between the grid size and the size of the grid under a pixel coordinate system, and the time domain vibration of the track and the fastener under a physical coordinate system is calculated.
Referring to fig. 6 and 7, let the calibration gray image be I(x, y); the gray gradients of the image in the horizontal and vertical directions are respectively:

G_x = H_x * I(x, y), \quad G_y = H_y * I(x, y) (7)

wherein: * denotes the convolution operation, and H_x, H_y are the gradient operators in the x and y directions, respectively.
The gradient amplitude is:

G = \sqrt{G_x^2 + G_y^2} (8)

The size of the grid in the pixel coordinate system is determined by comparing the gradient amplitudes and is denoted J, in pixels; the actual size of the grid is R, in millimeters (mm). The scale factor SF is:

SF = R / J (9)
the time domain vibration of the track is:
wherein: s is S x 、S y The time domain vibration of the rail and the fastener in the horizontal and vertical directions under the physical coordinate system is respectively.
Step 6: FFT decomposition is performed on the track time-domain vibration signal to obtain the natural vibration frequency of the track:

X(n) = \sum_{k=0}^{K-1} S_k \, e^{-j 2\pi n k / K}, \quad n = 0, 1, \ldots, K-1 (11)

the natural vibration frequency being identified from the dominant peak of |X(n)|, at frequency f_n = n f / K.
and judging whether the track loosens or not in real time through the change of the natural vibration frequency.

Claims (5)

1. The non-contact real-time detection method for train track looseness based on camera shooting is characterized by comprising the following steps of:
step 1), video recording is carried out on vibration of a train track by utilizing a high-speed camera;
step 2), converting the recorded video to grayscale frame by frame;
step 3), calculating the gray gradient of the pixels of the first frame image, clustering the gray gradient by adopting a K-means clustering algorithm, determining a track and a fastener area, and taking the gray extreme value of the pixels in the area as a virtual characteristic point of the area;
step 4), constructing an image multi-scale pyramid by an image multi-scale decomposition technology, establishing an optical flow equation of images on different scales based on a short-time brightness constant theory, and calculating optical flows of virtual feature points on the images on different scales by adopting a least square algorithm;
step 5), fusing optical flows of virtual feature points on different scale images by utilizing the scale relation among pyramid images to obtain a calculation result of the optical flows of the virtual feature points, and acquiring time domain vibration signals of the track and the fastener by combining an image calibration technology;
step 6), carrying out frequency domain analysis on the time domain vibration signals of the track and the fastener to obtain the natural frequency of the track, and detecting the loosening of the track through the characteristic of the change of the natural frequency;
in step 3), let the set of horizontal gray gradients computed from the pixels of the first frame image be G_x = {g_x^1, g_x^2, \ldots, g_x^l}; a K-means clustering algorithm is adopted to cluster-analyze the horizontal gray gradients:

E_x = \sum_{v=1}^{2} \sum_{g_x^i \in C_v} ( g_x^i - \bar{g}_x^v )^2

wherein: E_x is the within-class sum of squares in the horizontal direction, and the best clustering is achieved when its value is minimal; v is the cluster index, v = 1 represents the track region, v = 2 represents the other regions; \bar{g}_x^v is the mean horizontal gradient of cluster C_v; l is the number of image pixels;
let the set of vertical gray gradients computed from the pixels of the first frame image be G_y = {g_y^1, g_y^2, \ldots, g_y^l}; the K-means clustering algorithm is adopted to cluster-analyze the vertical gray gradients:

E_y = \sum_{h=1}^{2} \sum_{g_y^i \in C_h} ( g_y^i - \bar{g}_y^h )^2

wherein: E_y is the within-class sum of squares in the vertical direction, minimal at the best clustering; h is the cluster index, h = 1 represents the fastener and sleeper region, h = 2 represents the other regions; \bar{g}_y^h is the mean vertical gradient of cluster C_h; l is the number of image pixels;
the detected track and fastener regions are denoted Ω = (Ω_1, Ω_2, \ldots, Ω_q), q being the number of identified track and fastener regions, and the pixel gray extreme point in each region is taken as the virtual feature point of that region; for region Ω_1, the pixel gray extremum in the region is expressed as:

I(x_1^*, y_1^*) = \max_{i = 1, \ldots, n_1} I(x_i, y_i), \quad (x_i, y_i) \in \Omega_1

wherein: (x_1^*, y_1^*) are the pixel coordinates of the point with the largest gray value in region Ω_1, this point being defined as the virtual feature point of region Ω_1; I(x_1^*, y_1^*) is the gray value of the virtual feature point; I(x, y) is the gray value at pixel (x, y); n_1 is the number of pixels in region Ω_1;
and gray extreme points are detected in the track and fastener regions at different positions, yielding the virtual feature points at the different positions.
2. The method for non-contact real-time detection of train track looseness based on camera shooting according to claim 1, wherein in said step 4), for the virtual feature point (k_x, k_y), the optical flow equations of the images at different scales are obtained based on the short-time brightness-constancy theory and the spatial consistency assumption; a neighborhood window of size m × m is selected around the virtual feature point, and according to the principle of motion consistency of pixels within the neighborhood, the m × m pixels in the neighborhood of the virtual feature point satisfy the over-determined system:

\begin{bmatrix} I_x(k_x-m, k_y-m) & I_y(k_x-m, k_y-m) \\ \vdots & \vdots \\ I_x(k_x+m, k_y+m) & I_y(k_x+m, k_y+m) \end{bmatrix} \begin{bmatrix} u \\ v \end{bmatrix} = - \begin{bmatrix} I_t(k_x-m, k_y-m) \\ \vdots \\ I_t(k_x+m, k_y+m) \end{bmatrix}

wherein: u, v are the optical flows of the virtual feature point (k_x, k_y) in the horizontal and vertical directions; (k_x-m, k_y-m), \ldots, (k_x+m, k_y+m) are the pixel coordinates within the neighborhood of the virtual feature point; I_x(\cdot) and I_y(\cdot) are the gradients of the pixel gray values in the neighborhood in the x and y directions, respectively; I_t(\cdot) is the derivative of the pixel gray values in the neighborhood with respect to time t;
And solving the equation by adopting a least square method to obtain optical flows of the virtual feature points on different scales.
3. The method for non-contact real-time detection of train track looseness based on camera shooting according to claim 1, wherein in said step 5), the optical flow of the virtual feature point in different frames is set to {u_k, v_k | k = 1, 2, 3, \ldots, K} in the horizontal and vertical directions, in pixels/frame, K being the total number of frames of the video; the motion of the structure is obtained from the optical flow of the virtual feature point:

m_x(k) = \sum_{j=1}^{k} u_j, \quad m_y(k) = \sum_{j=1}^{k} v_j, \quad t_k = k / f

wherein: m_x, m_y are the motions of the virtual feature point in the horizontal and vertical directions, respectively, and f is the shooting frame rate;
and calibrating the camera by using a grid calibration plate, acquiring a scale factor according to the size relation between the grid size and the size of the grid in a pixel coordinate system, and calculating the time domain vibration of the track and the fastener in a physical coordinate system.
4. The non-contact real-time detection method for train track looseness based on camera shooting according to claim 3, wherein when the camera is calibrated, the calibration gray image is I(x, y), and the gray gradients of the image in the horizontal and vertical directions are respectively:

G_x = H_x * I(x, y), \quad G_y = H_y * I(x, y)

wherein: * denotes the convolution operation, and H_x, H_y are the gradient operators in the x and y directions, respectively;
the gradient amplitude is:
the size of the grid in the pixel coordinate system can be determined by comparing the gradient amplitude values, and is marked as J, the unit is pixel (pixel), the actual size of the grid is R, the unit is millimeter (mm), and the scale factor SF is:
the time domain vibration of the track is:
wherein: s is S x 、S y The time domain vibration of the rail and the fastener in the horizontal and vertical directions under the physical coordinate system is respectively.
5. The method for non-contact real-time detection of train track looseness based on camera shooting according to claim 1, wherein in said step 6), FFT decomposition is performed on the time-domain vibration signal of the track to obtain the natural vibration frequency of the track:

X(n) = \sum_{k=0}^{K-1} S_k \, e^{-j 2\pi n k / K}, \quad n = 0, 1, \ldots, K-1

the natural vibration frequency being identified from the dominant peak of |X(n)|, at frequency f_n = n f / K;
and judging whether the track loosens or not in real time through the change of the natural vibration frequency.
CN202111163756.4A 2021-09-30 2021-09-30 Train track looseness non-contact real-time detection method based on camera shooting Active CN113947116B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111163756.4A CN113947116B (en) 2021-09-30 2021-09-30 Train track looseness non-contact real-time detection method based on camera shooting

Publications (2)

Publication Number Publication Date
CN113947116A CN113947116A (en) 2022-01-18
CN113947116B true CN113947116B (en) 2023-10-31

Family

ID=79329769


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106845364A (en) * 2016-12-28 2017-06-13 中国航天电子技术研究院 A kind of fast automatic object detection method
CN111532295A (en) * 2019-12-28 2020-08-14 昆山高新轨道交通智能装备有限公司 Rail transit removes intelligent operation and maintenance detecting system
CN112381860A (en) * 2020-11-21 2021-02-19 西安交通大学 Unmarked computer vision method for measuring dynamic frequency of rotating blade
CN112763904A (en) * 2020-12-29 2021-05-07 广州航天海特系统工程有限公司 Circuit breaker detection method, device, equipment and storage medium
WO2021163928A1 (en) * 2020-02-19 2021-08-26 华为技术有限公司 Optical flow obtaining method and apparatus


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
储林臻; 闫钧华; 杭谊青; 许俊峰. Real-time detection of ground-moving targets against a rotating-motion background based on an improved optical flow method. Journal of Data Acquisition and Processing, 2015, (06), full text. *
李鹏程; 郑树彬; 彭乐乐; 李立明. Research on the regular distribution of feature points in track images. Computer Measurement & Control, 2019, (04), full text. *


Similar Documents

Publication Publication Date Title
CN109238756B (en) Dynamic image detection equipment and detection method for freight car operation fault
CN109029283B (en) Track fastener bolt floating detection method based on height comparison
CN111882882B (en) Method for detecting cross-lane driving behavior of automobile in dynamic flat-plate scale weighing area
CN110567680B (en) Track fastener looseness detection method based on angle comparison
She et al. Feasibility study of asphalt pavement pothole properties measurement using 3D line laser technology
CN105158257A (en) Sliding plate measurement method and device
US20220383478A1 (en) Computer vision-based system and method for assessment of load distribution, load rating, and vibration serviceability of structures
CN102636364B (en) Vehicular safety monitoring system for shapes and structures of bridge floors and detection method
JP2017215220A (en) Railway vehicle appearance inspection device
Pan et al. On-site reliable wheel size measurement based on multisensor data fusion
CN108797241B (en) Track fastener nut looseness detection method based on height comparison
CN106373125A (en) Information entropy-based snowflake noise detection method
CN111754460A (en) Method, system and storage medium for automatically detecting gap of point switch
CN115761487A (en) Method for quickly identifying vibration characteristics of small and medium-span bridges based on machine vision
CN113191239A (en) Vehicle overall dimension dynamic detection system based on computer vision
Qiu et al. Rail fastener positioning based on double template matching
Li et al. A visual inspection system for rail corrugation based on local frequency features
CN113947116B (en) Train track looseness non-contact real-time detection method based on camera shooting
CN111127409A (en) Train component detection method based on SIFT image registration and cosine similarity
CN112949483B (en) Non-contact rail stretching displacement real-time measurement method based on fast R-CNN
Lydon et al. Development and testing of a composite system for bridge health monitoring utilising computer vision and deep learning
CN113610786A (en) Track deformation monitoring method based on visual measurement
CN117953048A (en) Swivel bridge attitude monitoring system and method based on computer vision
Yang et al. Online pantograph-catenary contact point detection in complicated background based on multiple strategies
CN106875378B (en) A kind of power line foreign matter detecting method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant