CN115752250A - Bridge high-precision displacement monitoring method fusing computer vision and acceleration - Google Patents
Bridge high-precision displacement monitoring method fusing computer vision and acceleration
- Publication number
- CN115752250A (application CN202211378696.2A)
- Authority
- CN
- China
- Prior art keywords
- displacement
- measuring point
- image
- time
- acceleration
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses a high-precision bridge displacement monitoring method fusing computer vision and acceleration, which comprises the following steps: shooting a vibration time-course image sequence of a measuring point on the bridge structure under load with image acquisition equipment, while synchronously acquiring the measured acceleration response of the measuring point; continuously tracking the measuring-point position through the image sequence with a template matching algorithm to obtain the displacement time-course curve of the measuring point in image coordinates; numerically integrating the measured acceleration response of the measuring point to obtain its dynamic displacement; band-pass filtering both the image-coordinate displacement time-course curve and the dynamic displacement, and fitting a scale factor between them by least squares; low-pass filtering the image-coordinate displacement time-course curve, high-pass filtering the dynamic displacement, and fusing the two filtered results to obtain the high-precision displacement of the measuring point. The invention combines the advantages of computer vision and acceleration measurement; the calculation process is simple, the computational load is small, and the displacement accuracy is high.
Description
Technical Field
The invention belongs to the field of structural engineering, and particularly relates to a bridge high-precision displacement monitoring method fusing computer vision and acceleration.
Background
The displacement response of a structure under load is a key parameter for structural health monitoring and safety-state evaluation. Monitoring displacement yields stiffness information about the structure, and an abnormality in local displacement can indicate local stiffness damage.
Existing methods for measuring the displacement of in-service bridges fall mainly into direct and indirect methods. Direct methods measure structural displacement with contact or non-contact displacement sensors. Contact sensors include the dial indicator, the linear variable differential transformer (LVDT) displacement gauge, the total station, the level, and so on. The dial indicator and the LVDT give accurate real-time displacement, but a measuring platform must be built under the bridge and the sensor must be installed manually close to the bridge soffit; this is cumbersome in practice and unsuitable for bridges crossing water or with large under-bridge clearance. The total station and the level have low sampling frequencies, can acquire only static displacement, and cannot measure real-time displacement response under dynamic load. To overcome the shortcomings of contact sensors, much research has been conducted on the Global Positioning System (GPS), the laser Doppler vibrometer (LDS), radar interferometry systems, and non-contact measurement methods based on computer vision. GPS can measure the three-dimensional coordinates of points on the structure, but its limited sampling frequency and measurement accuracy prevent it from capturing real-time dynamic displacement response. LDS achieves high-precision, high-frequency measurement, but it is expensive and supports only single-point measurement. Radar interferometry achieves multi-point high-precision measurement, but reflectors must be installed on the structure before measurement, which is operationally complex.
The non-contact displacement measurement method based on computer vision has been widely studied and partially applied owing to its simplicity, low cost, full-field capability, and remote operation. Its principle is to film the measured structure with a camera, track the measuring point through the image sequence with a target tracking algorithm to obtain its motion trajectory in the image, and convert from the image scale to the real scale to recover the true deformation of the structure. In practice, however, the sampling frequency of common consumer-grade cameras is mostly 20–60 Hz, so the high-frequency response of the structure cannot be acquired; moreover, the camera's optical axis must be perpendicular to the deformation direction of the structure, otherwise large errors arise in the image-to-real-size conversion; finally, the scale factor that converts image size into real size is calculated from a known distance or structural dimension and the corresponding number of pixels in the image.
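The image-to-physical conversion described above can be sketched numerically. The following snippet is illustrative only and not from the patent: `scale_factor`, the 2.0 m girder dimension, the 400 px extent, and the 10° tilt are assumed values. It shows how a scale factor in m/px is obtained from a known structural dimension, and how a tilt between the optical axis and the motion plane biases the converted displacement:

```python
import math

def scale_factor(known_length_m, length_px):
    """Physical length per pixel (m/px) from a known dimension on the structure."""
    return known_length_m / length_px

# Hypothetical numbers: a 2.0 m girder segment spans 400 px in the image.
sf = scale_factor(2.0, 400)       # 0.005 m/px
disp_px = 3.2                     # tracked pixel displacement
disp_m = disp_px * sf             # converted physical displacement

# A tilt of angle theta between the motion plane and the image plane scales
# the apparent pixel motion by roughly cos(theta), biasing the result.
theta = math.radians(10.0)
disp_m_tilted = disp_px * math.cos(theta) * sf
```

This is why the optical-axis alignment mentioned above matters: the relative error from tilt alone is about 1 − cos θ.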
Indirect methods measure the response of other physical quantities of the structure and then calculate the displacement from them. For example, displacement can be calculated by double integration of acceleration. However, conventional acceleration sensors respond poorly at very low frequencies, and the numerical integration process amplifies low-frequency noise. The high-frequency displacement component calculated from acceleration is therefore considered more reliable than the low-frequency component.
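A minimal numerical sketch (not from the patent; the 2 Hz sinusoid and the 0.01 m/s² bias are assumed values) of why double integration amplifies low-frequency error: a constant acceleration bias b grows into a displacement drift of roughly ½bt² after two integrations:

```python
import numpy as np

fs = 100.0                               # sampling frequency, Hz
t = np.arange(0, 10, 1 / fs)
omega = 2 * np.pi * 2.0                  # 2 Hz structural vibration
a_true = -omega**2 * np.sin(omega * t)   # acceleration of u(t) = sin(omega*t)
a_meas = a_true + 0.01                   # constant low-frequency sensor bias

def double_integrate(a, dt):
    """Crude cumulative double integration (rectangle rule)."""
    v = np.cumsum(a) * dt
    return np.cumsum(v) * dt

# The bias alone produces a displacement drift of about 0.5 * b * t**2,
# i.e. roughly 0.5 m after 10 s from a 0.01 m/s^2 bias.
drift = double_integrate(a_meas, 1 / fs) - double_integrate(a_true, 1 / fs)
```

The quadratic growth of `drift` is the low-frequency noise amplification that motivates keeping only the high-frequency part of the acceleration-derived displacement.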
Disclosure of Invention
Aiming at the above defects in the prior art, the bridge high-precision displacement monitoring method fusing computer vision and acceleration solves two problems: that conventional acceleration sensors respond poorly at ultralow frequencies and numerical integration amplifies low-frequency noise; and that the sampling frequency of common consumer-grade cameras cannot capture the high-frequency response of the structure.
In order to achieve the purpose of the invention, the invention adopts the technical scheme that: a bridge high-precision displacement monitoring method fusing computer vision and acceleration comprises the following steps:
s1, shooting a vibration time-course image sequence of a measuring point on a bridge structure under the action of a load by using image acquisition equipment, and synchronously acquiring an actually-measured acceleration response of the measuring point;
s2, continuously tracking the positions of the measuring points based on a template matching algorithm according to the vibration time-course image sequence to obtain a displacement time-course curve of the measuring points under the image coordinates;
s3, carrying out numerical integration on the measured acceleration response of the measuring point to obtain the dynamic displacement of the measuring point;
s4, performing band-pass filtering on the displacement time-course curve and the dynamic displacement of the measuring point under the image coordinate, and fitting a scale factor by using a least square method;
and S5, carrying out low-pass filtering on the displacement time-course curve of the measuring point under the image coordinates, carrying out high-pass filtering on the dynamic displacement, and carrying out data fusion on the two filtered results to obtain the high-precision displacement of the measuring point.
Further, the specific implementation manner of step S1 is as follows:
arranging an acceleration sensor at the target measuring point of the monitored bridge, and acquiring the vertical vibration acceleration of the bridge under load with a matched data acquisition system to obtain the measured acceleration response of the measuring point; fixing the image acquisition equipment on a tripod at a stable position off the bridge, and adjusting the equipment angle so that the measuring point lies in the middle of the field of view, to obtain the vibration time-course image sequence.
Further, the specific implementation manner of step S2 is as follows:
s2-1, converting the vibration time-course image sequence into a gray level image sequence;
s2-2, framing an image subset containing target measuring points in the first frame of image, namely a template;
s2-3, predicting the motion range of the template according to experience, and framing an image subset which can contain the motion range of the template, namely a region of interest (ROI);
s2-4, according to the formula:

$$R(i,j)=\frac{\sum_{m=1}^{M}\sum_{n=1}^{N}\left[T(m,n)-\bar{T}\right]\left[M_{i,j}(m,n)-\bar{M}_{i,j}\right]}{\sqrt{\sum_{m=1}^{M}\sum_{n=1}^{N}\left[T(m,n)-\bar{T}\right]^{2}\cdot\sum_{m=1}^{M}\sum_{n=1}^{N}\left[M_{i,j}(m,n)-\bar{M}_{i,j}\right]^{2}}}$$

obtaining the normalized correlation coefficient $R(i,j)$ over the region of interest ROI; wherein $T(m,n)$ is the template selected from the first frame image; $\bar{T}$ is the gray-level mean of all pixel points in the template $T(m,n)$; $M_{i,j}(m,n)$ is the region of interest ROI of the image on which the correlation coefficient is to be calculated; $\bar{M}_{i,j}$ is the gray-level mean of all pixel points on the image subset of $M_{i,j}(m,n)$ covered by the template $T(m,n)$; $(i,j)$ are the coordinates after the template is translated; $M$ is the width of the template and $N$ is its height;

s2-5, taking the location of the maximum of $R(i,j)$ within the ROI as the integer-pixel best matching point;
s2-6, according to the formulas:

$$I(x,y)=A\exp\left[-\left(\frac{(x-x_{0})^{2}}{2\sigma_{x}^{2}}+\frac{(y-y_{0})^{2}}{2\sigma_{y}^{2}}\right)\right],\qquad F=\ln I(x,y)$$

$$F=ax^{2}+by^{2}+cx+dy+e$$

obtaining the sub-pixel best matching point coordinates $(x_{0},y_{0})$; wherein $A$ is the amplitude of the correlation coefficient; $\sigma_{x}$ is the standard deviation in the x direction; $\sigma_{y}$ is the standard deviation in the y direction; $x_{0}$ is the x coordinate of the sub-pixel best matching point; $y_{0}$ is the y coordinate of the sub-pixel best matching point; $I(x,y)$ reaches its maximum at $x=x_{0}$ and $y=y_{0}$;
s2-7, obtaining the sub-pixel coordinates of the target measuring point in each subsequent frame, and concatenating them in time to obtain the sub-pixel displacement time-course curve of the target measuring point in the image coordinate system, denoted $u_{im}(t_{im})$.
Further, the specific implementation manner of step S3 is as follows:
according to the formulas:

$$\mu=46.81N^{-1.95}$$

$$\mathbf{C}=\left(\mathbf{L}^{T}\mathbf{L}+\mu^{2}\mathbf{I}\right)^{-1}\mathbf{L}^{T}\mathbf{L}_{a'}$$

obtaining the regularization coefficient $\mu$ and the dynamic displacement of the measuring point $u_{a'}=\mathbf{C}a'(\Delta t)^{2}$; wherein $\mathbf{L}=\mathbf{L}_{a'}\mathbf{L}_{c}$; $\mathbf{L}_{a'}$ is a diagonal weighting matrix of order $2k+1$ whose first and last diagonal entries take prescribed boundary values and whose remaining elements are 1; $\mathbf{L}_{c}$ is a $(2k+1)\times(2k+3)$ linear differential operator; $2k+1$ is the number of acceleration data points in one calculation time window; $\Delta t$ is the time interval between two adjacent sampling points; $N$ is the total number of acceleration data points; $\mathbf{I}$ is the identity matrix of order $2k+3$; $\mathbf{C}$ is the coefficient matrix; $(\cdot)^{T}$ denotes the matrix transpose; $a'$ is the measured acceleration vector.
Further, the specific implementation manner of step S4 is as follows:
s4-1, according to the formulas:

$$H_{3}(x)=y_{1}\zeta_{0}(x)+y_{2}\zeta_{1}(x)+m_{0}\eta_{0}(x)+m_{1}\eta_{1}(x)$$

$$R_{xy}(\tau)=\lim_{T\to\infty}\frac{1}{T}\int_{0}^{T}x(t)\,y(t+\tau)\,dt$$

time-synchronizing $u_{a'}$ with the image-coordinate sub-pixel displacement according to the cross-correlation coefficient $R_{xy}(\tau)$ of $u_{im}(t)$ and $u_{a'}$, and upsampling $u_{im}(t_{im})$ with the cubic Hermite interpolation polynomial $H_{3}(x)$ to obtain the image-coordinate sub-pixel displacement $u_{im}(t)$ with the same sampling time interval as $u_{a'}$; wherein the coordinates of the two points to be interpolated are $(x_{1},y_{1})$ and $(x_{2},y_{2})$; $m_{0}$ is the derivative value at point $x_{1}$; $m_{1}$ is the derivative value at point $x_{2}$; $y_{1}$ is the function value at point $x_{1}$; $y_{2}$ is the function value at point $x_{2}$; $T$ is the measurement duration of the random signals $x(t)$ and $y(t)$; $\tau$ is the time lag between the two signals; $t$ denotes time;
s4-2, band-pass filtering the image-coordinate sub-pixel displacement of the measuring point and the dynamic displacement of the measuring point respectively, setting the lower cut-off frequency to the lower limit of the frequency response range of the acceleration sensor and the upper cut-off frequency to 1/10–1/2 of the sampling frequency of the image acquisition equipment, to obtain the common-frequency-band image-coordinate sub-pixel displacement $\bar{u}_{im}(t)$ and measuring-point dynamic displacement $\bar{u}_{a'}(t)$;
S4-3, according to the formula:

$$SF=\frac{\sum_{t}\bar{u}_{im}(t)\,\bar{u}_{a'}(t)}{\sum_{t}\bar{u}_{im}^{2}(t)}$$

obtaining the scale factor $SF$ by least squares; wherein $\bar{u}_{im}(t)$ and $\bar{u}_{a'}(t)$ are the band-pass filtered image-coordinate sub-pixel displacement and dynamic displacement from step S4-2.
Further, the specific implementation manner of step S5 is as follows:
s5-1, acquiring a pair of complementary filters $H_{L}$ and $H_{H}$ whose amplitudes sum to 1 and whose phases sum to 0;
S5-2, according to the formulas:

$$u_{Static}(t)=H_{L}\left[u_{im}(t)\right],\qquad u_{Dynamic}(t)=H_{H}\left[u_{a'}(t)\right]$$

applying the low-pass filter $H_{L}$ to $u_{im}(t)$ to obtain the quasi-static image-coordinate sub-pixel displacement $u_{Static}(t)$, and applying the high-pass filter $H_{H}$ to $u_{a'}$ to obtain the dynamic displacement $u_{Dynamic}(t)$; wherein $C_{L}$ and $C_{H}$ are the filter coefficients of $H_{L}$ and $H_{H}$; $j$ denotes the $j$-th acceleration data point in one calculation time window;
s5-3, according to the formula:

$$u(t)=SF\cdot u_{Static}(t)+u_{Dynamic}(t)$$

obtaining the high-precision displacement $u(t)$ of the measuring point; wherein $SF$ is the scale factor fitted in step S4.
The invention has the beneficial effects that: the method takes the low-frequency component of the vision displacement as the quasi-static displacement and the high-frequency part of the acceleration-integrated displacement as the dynamic displacement, and fuses the two to obtain high-precision displacement; the procedure is simple, the computational load is small, and the accuracy is high.
Drawings
FIG. 1 is a flow chart of the present invention;
fig. 2 is a schematic diagram of a template matching algorithm.
Detailed Description
The following description of the embodiments of the invention is provided to help those skilled in the art understand it, but it should be understood that the invention is not limited to the scope of these embodiments. For those skilled in the art, as long as the various changes fall within the spirit and scope of the invention as defined by the appended claims, all inventions and creations making use of the inventive concept are protected.
As shown in fig. 1, a method for monitoring high-precision displacement of a bridge by fusing computer vision and acceleration comprises the following steps:
s1, shooting a vibration time-course image sequence of a measuring point on a bridge structure under the action of a load by using image acquisition equipment, and synchronously acquiring an actually-measured acceleration response of the measuring point;
s2, continuously tracking the positions of the measuring points based on a template matching algorithm according to the vibration time-course image sequence to obtain a displacement time-course curve of the measuring points under the image coordinates;
s3, carrying out numerical integration on the measured acceleration response of the measuring point to obtain the dynamic displacement of the measuring point;
s4, performing band-pass filtering on the displacement time-course curve and the dynamic displacement of the measuring point under the image coordinate, and fitting a scale factor by using a least square method;
and S5, carrying out low-pass filtering on the displacement time-course curve of the measuring point under the image coordinates, carrying out high-pass filtering on the dynamic displacement, and carrying out data fusion on the two filtered results to obtain the high-precision displacement of the measuring point.
The specific implementation manner of the step S1 is as follows:
arranging an acceleration sensor at the target measuring point of the monitored bridge, and acquiring the vertical vibration acceleration of the bridge under load with a matched data acquisition system to obtain the measured acceleration response of the measuring point; fixing the image acquisition equipment on a tripod at a stable position off the bridge, and adjusting the equipment angle so that the measuring point lies in the middle of the field of view, to obtain the vibration time-course image sequence.
The specific implementation manner of step S3 is as follows:
according to the formulas:

$$\mu=46.81N^{-1.95}$$

$$\mathbf{C}=\left(\mathbf{L}^{T}\mathbf{L}+\mu^{2}\mathbf{I}\right)^{-1}\mathbf{L}^{T}\mathbf{L}_{a'}$$

obtaining the regularization coefficient $\mu$ and the dynamic displacement of the measuring point $u_{a'}=\mathbf{C}a'(\Delta t)^{2}$; wherein $\mathbf{L}=\mathbf{L}_{a'}\mathbf{L}_{c}$; $\mathbf{L}_{a'}$ is a diagonal weighting matrix of order $2k+1$ whose first and last diagonal entries take prescribed boundary values and whose remaining elements are 1; $\mathbf{L}_{c}$ is a $(2k+1)\times(2k+3)$ linear differential operator; $2k+1$ is the number of acceleration data points in one calculation time window; $\Delta t$ is the time interval between two adjacent sampling points; $N$ is the total number of acceleration data points; $\mathbf{I}$ is the identity matrix of order $2k+3$; $\mathbf{C}$ is the coefficient matrix; $(\cdot)^{T}$ denotes the matrix transpose; $a'$ is the measured acceleration vector.
The specific implementation manner of step S4 is as follows:
s4-1, according to the formulas:

$$H_{3}(x)=y_{1}\zeta_{0}(x)+y_{2}\zeta_{1}(x)+m_{0}\eta_{0}(x)+m_{1}\eta_{1}(x)$$

$$R_{xy}(\tau)=\lim_{T\to\infty}\frac{1}{T}\int_{0}^{T}x(t)\,y(t+\tau)\,dt$$

time-synchronizing $u_{a'}$ with the image-coordinate sub-pixel displacement according to the cross-correlation coefficient $R_{xy}(\tau)$ of $u_{im}(t)$ and $u_{a'}$, and upsampling $u_{im}(t_{im})$ with the cubic Hermite interpolation polynomial $H_{3}(x)$ to obtain the image-coordinate sub-pixel displacement $u_{im}(t)$ with the same sampling time interval as $u_{a'}$; wherein the coordinates of the two points to be interpolated are $(x_{1},y_{1})$ and $(x_{2},y_{2})$; $m_{0}$ is the derivative value at point $x_{1}$; $m_{1}$ is the derivative value at point $x_{2}$; $y_{1}$ is the function value at point $x_{1}$; $y_{2}$ is the function value at point $x_{2}$; $T$ is the measurement duration of the random signals $x(t)$ and $y(t)$; $\tau$ is the time lag between the two signals; $t$ denotes time;
s4-2, band-pass filtering the image-coordinate sub-pixel displacement of the measuring point and the dynamic displacement of the measuring point respectively, setting the lower cut-off frequency to the lower limit of the frequency response range of the acceleration sensor and the upper cut-off frequency to 1/10–1/2 of the sampling frequency of the image acquisition equipment, to obtain the common-frequency-band image-coordinate sub-pixel displacement $\bar{u}_{im}(t)$ and measuring-point dynamic displacement $\bar{u}_{a'}(t)$;
S4-3, according to the formula:

$$SF=\frac{\sum_{t}\bar{u}_{im}(t)\,\bar{u}_{a'}(t)}{\sum_{t}\bar{u}_{im}^{2}(t)}$$

obtaining the scale factor $SF$ by least squares; wherein $\bar{u}_{im}(t)$ and $\bar{u}_{a'}(t)$ are the band-pass filtered image-coordinate sub-pixel displacement and dynamic displacement from step S4-2.
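The least-squares scale-factor fit of step S4-3 can be sketched with synthetic signals (the 5 mm/px factor and the noise levels are assumptions for illustration). With band-passed signals sharing the same dynamic content, SF is the ordinary least-squares slope of the acceleration-derived displacement against the image-coordinate displacement:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
u_band = np.sin(2 * np.pi * 1.5 * t)     # shared band-passed dynamic component

# Image-coordinate displacement in pixels, acceleration-derived displacement
# in mm; the true (here known, in practice unknown) factor is 5 mm/px.
u_im_bp = u_band + 0.02 * rng.standard_normal(t.size)
u_acc_bp = 5.0 * u_band + 0.05 * rng.standard_normal(t.size)

# Least-squares slope: SF = sum(u_im * u_acc) / sum(u_im**2)
SF = float(u_im_bp @ u_acc_bp / (u_im_bp @ u_im_bp))
```

Restricting the fit to the common frequency band, as S4-2 prescribes, is what makes the two signals proportional so that a single slope captures the pixel-to-physical conversion.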
The specific implementation manner of step S5 is as follows:
s5-1, obtaining a pair of complementary filters $H_{L}$ and $H_{H}$ whose amplitudes sum to 1 and whose phases sum to 0;
S5-2, according to the formulas:

$$u_{Static}(t)=H_{L}\left[u_{im}(t)\right],\qquad u_{Dynamic}(t)=H_{H}\left[u_{a'}(t)\right]$$

applying the low-pass filter $H_{L}$ to $u_{im}(t)$ to obtain the quasi-static image-coordinate sub-pixel displacement $u_{Static}(t)$, and applying the high-pass filter $H_{H}$ to $u_{a'}$ to obtain the dynamic displacement $u_{Dynamic}(t)$; wherein $C_{L}$ and $C_{H}$ are the filter coefficients of $H_{L}$ and $H_{H}$; $j$ denotes the $j$-th acceleration data point in one calculation time window;
s5-3, according to the formula:

$$u(t)=SF\cdot u_{Static}(t)+u_{Dynamic}(t)$$

obtaining the high-precision displacement $u(t)$ of the measuring point; wherein $SF$ is the scale factor fitted in step S4.
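A sketch of the complementary fusion in step S5, using an assumed FFT-domain filter pair rather than the patent's $C_{L}$/$C_{H}$ coefficients (which are not reproduced in this text). The pair satisfies H_L + H_H = 1 exactly and has zero phase, so the low band is taken from the vision displacement and the high band from the acceleration-derived displacement:

```python
import numpy as np

def complementary_fuse(u_vis, u_acc, fs, fc):
    """Fuse vision displacement (trusted at low frequency) with
    acceleration-derived displacement (trusted at high frequency) using an
    exactly complementary, zero-phase FFT-domain filter pair."""
    f = np.fft.rfftfreq(u_vis.size, 1 / fs)
    H_L = 1.0 / (1.0 + (f / fc)**4)       # smooth low-pass gain
    H_H = 1.0 - H_L                        # complementary high-pass gain
    U = np.fft.rfft(u_vis) * H_L + np.fft.rfft(u_acc) * H_H
    return np.fft.irfft(U, n=u_vis.size)

fs = 200.0
t = np.arange(0, 10, 1 / fs)
low = np.sin(2 * np.pi * 0.2 * t)          # quasi-static component
high = 0.3 * np.sin(2 * np.pi * 15.0 * t)  # dynamic component
truth = low + high
u_vis = low + 0.2 * np.sin(2 * np.pi * 15.0 * t + 1.0)  # corrupted high band
u_acc = high + 0.5 * np.sin(2 * np.pi * 0.2 * t - 0.5)  # corrupted low band
fused = complementary_fuse(u_vis, u_acc, fs, fc=2.0)
```

Because the two gains sum exactly to one at every frequency, the fused record inherits the correct low band from the camera and the correct high band from the accelerometer, while each sensor's corrupted band is suppressed.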
As shown in fig. 2, the specific implementation manner of step S2 is as follows:
s2-1, converting the vibration time-course image sequence into a gray level image sequence;
s2-2, framing an image subset containing target measuring points in the first frame of image, namely a template;
s2-3, predicting the motion range of the template according to experience, and framing an image subset which can contain the motion range of the template, namely a region of interest (ROI);
s2-4, according to the formula:

$$R(i,j)=\frac{\sum_{m=1}^{M}\sum_{n=1}^{N}\left[T(m,n)-\bar{T}\right]\left[M_{i,j}(m,n)-\bar{M}_{i,j}\right]}{\sqrt{\sum_{m=1}^{M}\sum_{n=1}^{N}\left[T(m,n)-\bar{T}\right]^{2}\cdot\sum_{m=1}^{M}\sum_{n=1}^{N}\left[M_{i,j}(m,n)-\bar{M}_{i,j}\right]^{2}}}$$

obtaining the normalized correlation coefficient $R(i,j)$ over the region of interest ROI; wherein $T(m,n)$ is the template selected from the first frame image; $\bar{T}$ is the gray-level mean of all pixel points in the template $T(m,n)$; $M_{i,j}(m,n)$ is the region of interest ROI of the image on which the correlation coefficient is to be calculated; $\bar{M}_{i,j}$ is the gray-level mean of all pixel points on the image subset of $M_{i,j}(m,n)$ covered by the template $T(m,n)$; $(i,j)$ are the coordinates after the template is translated; $M$ is the width of the template and $N$ is its height;

s2-5, taking the location of the maximum of $R(i,j)$ within the ROI as the integer-pixel best matching point;
s2-6, according to the formulas:

$$I(x,y)=A\exp\left[-\left(\frac{(x-x_{0})^{2}}{2\sigma_{x}^{2}}+\frac{(y-y_{0})^{2}}{2\sigma_{y}^{2}}\right)\right],\qquad F=\ln I(x,y)$$

$$F=ax^{2}+by^{2}+cx+dy+e$$

obtaining the sub-pixel best matching point coordinates $(x_{0},y_{0})$; wherein $A$ is the amplitude of the correlation coefficient; $\sigma_{x}$ is the standard deviation in the x direction; $\sigma_{y}$ is the standard deviation in the y direction; $x_{0}$ is the x coordinate of the sub-pixel best matching point; $y_{0}$ is the y coordinate of the sub-pixel best matching point; $I(x,y)$ reaches its maximum at $x=x_{0}$ and $y=y_{0}$;
s2-7, obtaining the sub-pixel coordinates of the target measuring point in each subsequent frame, and concatenating them in time to obtain the sub-pixel displacement time-course curve of the target measuring point in the image coordinate system, denoted $u_{im}(t_{im})$.
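Steps s2-1 to s2-5 can be sketched in plain NumPy (illustrative only; a production implementation would typically use an optimized routine such as OpenCV's `matchTemplate`). The sketch computes the zero-mean normalized cross-correlation of the template against every integer-pixel placement inside the ROI and takes the maximum as the integer-pixel match:

```python
import numpy as np

def ncc_match(template, roi):
    """Zero-mean normalized cross-correlation of `template` over every
    integer-pixel placement inside `roi` (both 2-D grayscale arrays)."""
    th, tw = template.shape
    Tz = template - template.mean()
    t_norm = np.sqrt((Tz**2).sum())
    H, W = roi.shape
    R = np.full((H - th + 1, W - tw + 1), -1.0)
    for i in range(H - th + 1):
        for j in range(W - tw + 1):
            Pz = roi[i:i + th, j:j + tw] - roi[i:i + th, j:j + tw].mean()
            denom = t_norm * np.sqrt((Pz**2).sum())
            if denom > 0:
                R[i, j] = (Tz * Pz).sum() / denom
    return R

rng = np.random.default_rng(1)
frame = rng.random((60, 60))              # synthetic grayscale frame
template = frame[20:30, 25:35].copy()     # 10x10 patch around the target
R = ncc_match(template, frame)
i0, j0 = np.unravel_index(np.argmax(R), R.shape)   # integer-pixel match
```

Because the correlation is normalized by both patch norms, it is invariant to uniform brightness and contrast changes, which is why the same template keeps matching across frames as lighting drifts.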
In an embodiment of the present invention, a specific process for calculating the coordinates of the sub-pixel best matching point includes:
the two-dimensional gaussian distribution function is:

$$I(x,y)=A\exp\left[-\left(\frac{(x-x_{0})^{2}}{2\sigma_{x}^{2}}+\frac{(y-y_{0})^{2}}{2\sigma_{y}^{2}}\right)\right]$$

In the formula, $I(x,y)$ is the correlation coefficient of a pixel point; $A$ is the amplitude of the correlation coefficient; $\sigma_{x}$ and $\sigma_{y}$ are the standard deviations in the x and y directions, respectively; $x_{0}$ and $y_{0}$ are the x and y coordinates of the center of the correlation-coefficient distribution, at which $I(x,y)$ reaches its maximum. Taking the logarithm of both sides gives:

$$F=\ln I(x,y)=ax^{2}+by^{2}+cx+dy+e$$

In the formula, $[a,b,c,d,e]$ are the parameters to be estimated, with $a=-1/(2\sigma_{x}^{2})$, $b=-1/(2\sigma_{y}^{2})$, $c=x_{0}/\sigma_{x}^{2}$ and $d=y_{0}/\sigma_{y}^{2}$. Here $(x_{i},y_{i},F(x_{i},y_{i}))$ are the coordinates of the 5×5 pixel points in the neighborhood of the best integer-pixel matching point and their correlation coefficients, i.e. all the points used for the Gaussian fit. The Gaussian fitting is performed by least squares, with the objective function:
$$\varepsilon^{2}=\min\sum_{i}\left(ax_{i}^{2}+by_{i}^{2}+cx_{i}+dy_{i}+e-F_{i}\right)^{2}$$
Expanding the minimization yields the normal equations with a positive definite coefficient matrix, and $[a,b,c,d,e]$ is solved by the Householder transformation method. The sub-pixel best matching point coordinates are then:

$$x_{0}=-\frac{c}{2a},\qquad y_{0}=-\frac{d}{2b}$$
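The derivation above can be sketched as follows (illustrative: NumPy's least-squares solver stands in for the Householder solve, and the synthetic correlation surface is an assumed test case). The quadratic F = ln R is fitted over the 5×5 neighbourhood of the integer peak, and the sub-pixel peak is read off as x0 = −c/(2a), y0 = −d/(2b):

```python
import numpy as np

def gaussian_subpixel(R, ip, jp):
    """Refine the integer-pixel peak (ip, jp) of a correlation surface R by
    fitting F = ln R = a*x**2 + b*y**2 + c*x + d*y + e over the 5x5
    neighbourhood (least squares via numpy, in place of a Householder solve)."""
    ys, xs = np.mgrid[ip - 2:ip + 3, jp - 2:jp + 3]
    F = np.log(R[ip - 2:ip + 3, jp - 2:jp + 3]).ravel()
    x = xs.ravel().astype(float)
    y = ys.ravel().astype(float)
    A = np.column_stack([x**2, y**2, x, y, np.ones_like(x)])
    a, b, c, d, e = np.linalg.lstsq(A, F, rcond=None)[0]
    return -c / (2 * a), -d / (2 * b)      # sub-pixel peak (x0, y0)

# Synthetic Gaussian correlation surface peaking at (x0, y0) = (12.3, 9.6).
yy, xx = np.mgrid[0:25, 0:25].astype(float)
R = np.exp(-((xx - 12.3)**2 / 6.0 + (yy - 9.6)**2 / 4.0))
ip, jp = np.unravel_index(np.argmax(R), R.shape)
x0, y0 = gaussian_subpixel(R, ip, jp)
```

Since the log of a Gaussian is exactly quadratic, the fit recovers the continuous peak location even though the correlation surface is only sampled at integer pixels.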
Claims (6)
1. A bridge high-precision displacement monitoring method fusing computer vision and acceleration is characterized by comprising the following steps of:
s1, shooting a vibration time-course image sequence of a measuring point on a bridge structure under the action of a load by using image acquisition equipment, and synchronously acquiring an actually-measured acceleration response of the measuring point;
s2, continuously tracking the positions of the measuring points based on a template matching algorithm according to the vibration time course image sequence to obtain a displacement time course curve of the measuring points under the image coordinates;
s3, carrying out numerical integration on the measured acceleration response of the measuring point to obtain the dynamic displacement of the measuring point;
s4, performing band-pass filtering on the displacement time-course curve and the dynamic displacement of the measuring point under the image coordinate, and fitting a scale factor by using a least square method;
and S5, carrying out low-pass filtering on the displacement time-course curve of the measuring point under the image coordinates, carrying out high-pass filtering on the dynamic displacement, and carrying out data fusion on the two filtered results to obtain the high-precision displacement of the measuring point.
2. The method for monitoring the high-precision displacement of the bridge with the fusion of the computer vision and the acceleration according to claim 1, wherein the step S1 is realized in the following specific manner:
arranging an acceleration sensor at the target measuring point of the monitored bridge, and acquiring the vertical vibration acceleration of the bridge under load with a matched data acquisition system to obtain the measured acceleration response of the measuring point; and fixing the image acquisition equipment on a tripod at a stable position off the bridge, and adjusting the equipment angle so that the measuring point lies in the middle of the field of view, to obtain the vibration time-course image sequence.
3. The method for monitoring the high-precision displacement of the bridge with the fusion of the computer vision and the acceleration according to claim 2, wherein the step S2 is realized in the following specific manner:
s2-1, converting the vibration time-course image sequence into a gray level image sequence;
s2-2, framing an image subset containing target measuring points in the first frame of image, namely a template;
s2-3, predicting the motion range of the template according to experience, and framing an image subset which can contain the motion range of the template, namely a region of interest (ROI);
s2-4, according to the formula:

$$R(i,j)=\frac{\sum_{m=1}^{M}\sum_{n=1}^{N}\left[T(m,n)-\bar{T}\right]\left[M_{i,j}(m,n)-\bar{M}_{i,j}\right]}{\sqrt{\sum_{m=1}^{M}\sum_{n=1}^{N}\left[T(m,n)-\bar{T}\right]^{2}\cdot\sum_{m=1}^{M}\sum_{n=1}^{N}\left[M_{i,j}(m,n)-\bar{M}_{i,j}\right]^{2}}}$$

obtaining the normalized correlation coefficient $R(i,j)$ over the region of interest ROI; wherein $T(m,n)$ is the template selected from the first frame image; $\bar{T}$ is the gray-level mean of all pixel points in the template $T(m,n)$; $M_{i,j}(m,n)$ is the region of interest ROI of the image on which the correlation coefficient is to be calculated; $\bar{M}_{i,j}$ is the gray-level mean of all pixel points on the image subset of $M_{i,j}(m,n)$ covered by the template $T(m,n)$; $(i,j)$ are the coordinates after the template is translated; $M$ is the width of the template and $N$ is its height;

s2-5, taking the location of the maximum of $R(i,j)$ within the ROI as the integer-pixel best matching point;
s2-6, according to the formulas:

$$I(x,y)=A\exp\left[-\left(\frac{(x-x_{0})^{2}}{2\sigma_{x}^{2}}+\frac{(y-y_{0})^{2}}{2\sigma_{y}^{2}}\right)\right],\qquad F=\ln I(x,y)$$

$$F=ax^{2}+by^{2}+cx+dy+e$$

obtaining the sub-pixel best matching point coordinates $(x_{0},y_{0})$; wherein $A$ is the amplitude of the correlation coefficient; $\sigma_{x}$ is the standard deviation in the x direction; $\sigma_{y}$ is the standard deviation in the y direction; $x_{0}$ is the x coordinate of the sub-pixel best matching point; $y_{0}$ is the y coordinate of the sub-pixel best matching point; $I(x,y)$ reaches its maximum at $x=x_{0}$ and $y=y_{0}$;
s2-7, obtaining the sub-pixel coordinates of the target measuring point in each subsequent frame, and concatenating them in time to obtain the sub-pixel displacement time-course curve of the target measuring point in the image coordinate system, denoted $u_{im}(t_{im})$.
4. The method for monitoring the high-precision displacement of the bridge with the fusion of the computer vision and the acceleration according to claim 3, wherein the step S3 is realized in the following specific manner:
according to the formulas:

$$\mu=46.81N^{-1.95}$$

$$\mathbf{C}=\left(\mathbf{L}^{T}\mathbf{L}+\mu^{2}\mathbf{I}\right)^{-1}\mathbf{L}^{T}\mathbf{L}_{a'}$$

obtaining the regularization coefficient $\mu$ and the dynamic displacement of the measuring point $u_{a'}=\mathbf{C}a'(\Delta t)^{2}$; wherein $\mathbf{L}=\mathbf{L}_{a'}\mathbf{L}_{c}$; $\mathbf{L}_{a'}$ is a diagonal weighting matrix of order $2k+1$ whose first and last diagonal entries take prescribed boundary values and whose remaining elements are 1; $\mathbf{L}_{c}$ is a $(2k+1)\times(2k+3)$ linear differential operator; $2k+1$ is the number of acceleration data points in one calculation time window; $\Delta t$ is the time interval between two adjacent sampling points; $N$ is the total number of acceleration data points; $\mathbf{I}$ is the identity matrix of order $2k+3$; $\mathbf{C}$ is the coefficient matrix; $(\cdot)^{T}$ denotes the matrix transpose; $a'$ is the measured acceleration vector.
5. The method for monitoring the high-precision displacement of the bridge with the fusion of the computer vision and the acceleration according to claim 4, wherein the step S4 is realized in the following specific manner:
s4-1, according to the formulas:

$$H_{3}(x)=y_{1}\zeta_{0}(x)+y_{2}\zeta_{1}(x)+m_{0}\eta_{0}(x)+m_{1}\eta_{1}(x)$$

$$R_{xy}(\tau)=\lim_{T\to\infty}\frac{1}{T}\int_{0}^{T}x(t)\,y(t+\tau)\,dt$$

time-synchronizing $u_{a'}$ with the image-coordinate sub-pixel displacement according to the cross-correlation coefficient $R_{xy}(\tau)$ of $u_{im}(t)$ and $u_{a'}$, and upsampling $u_{im}(t_{im})$ with the cubic Hermite interpolation polynomial $H_{3}(x)$ to obtain the image-coordinate sub-pixel displacement $u_{im}(t)$ with the same sampling time interval as $u_{a'}$; wherein the coordinates of the two points to be interpolated are $(x_{1},y_{1})$ and $(x_{2},y_{2})$; $m_{0}$ is the derivative value at point $x_{1}$; $m_{1}$ is the derivative value at point $x_{2}$; $y_{1}$ is the function value at point $x_{1}$; $y_{2}$ is the function value at point $x_{2}$; $T$ is the measurement duration of the random signals $x(t)$ and $y(t)$; $\tau$ is the time lag between the two signals; $t$ denotes time;
s4-2, band-pass filtering the image-coordinate sub-pixel displacement of the measuring point and the dynamic displacement of the measuring point respectively, setting the lower cut-off frequency to the lower limit of the frequency response range of the acceleration sensor and the upper cut-off frequency to 1/10–1/2 of the sampling frequency of the image acquisition equipment, to obtain the common-frequency-band image-coordinate sub-pixel displacement $\bar{u}_{im}(t)$ and measuring-point dynamic displacement $\bar{u}_{a'}(t)$;
S4-3, according to the formula:

$$SF=\frac{\sum_{t}\bar{u}_{im}(t)\,\bar{u}_{a'}(t)}{\sum_{t}\bar{u}_{im}^{2}(t)}$$

obtaining the scale factor $SF$ by least squares; wherein $\bar{u}_{im}(t)$ and $\bar{u}_{a'}(t)$ are the band-pass filtered image-coordinate sub-pixel displacement and dynamic displacement from step S4-2.
6. The method for monitoring the high-precision displacement of the bridge with the fusion of the computer vision and the acceleration according to claim 5, wherein the step S5 is realized in the following specific manner:
s5-1, obtaining a pair of complementary filters $H_{L}$ and $H_{H}$ whose amplitudes sum to 1 and whose phases sum to 0;
S5-2, according to the formulas:

$$u_{Static}(t)=H_{L}\left[u_{im}(t)\right],\qquad u_{Dynamic}(t)=H_{H}\left[u_{a'}(t)\right]$$

applying the low-pass filter $H_{L}$ to $u_{im}(t)$ to obtain the quasi-static image-coordinate sub-pixel displacement $u_{Static}(t)$, and applying the high-pass filter $H_{H}$ to $u_{a'}$ to obtain the dynamic displacement $u_{Dynamic}(t)$; wherein $C_{L}$ and $C_{H}$ are the filter coefficients of $H_{L}$ and $H_{H}$; $j$ denotes the $j$-th acceleration data point in one calculation time window;
s5-3, according to the formula:

$$u(t)=SF\cdot u_{Static}(t)+u_{Dynamic}(t)$$

obtaining the high-precision displacement $u(t)$ of the measuring point; wherein $SF$ is the scale factor fitted in step S4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211378696.2A CN115752250A (en) | 2022-11-04 | 2022-11-04 | Bridge high-precision displacement monitoring method fusing computer vision and acceleration |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115752250A true CN115752250A (en) | 2023-03-07 |
Family
ID=85356464
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211378696.2A Pending CN115752250A (en) | 2022-11-04 | 2022-11-04 | Bridge high-precision displacement monitoring method fusing computer vision and acceleration |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115752250A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116753843A (en) * | 2023-05-19 | 2023-09-15 | 北京建筑大学 | Engineering structure dynamic displacement monitoring method, device, equipment and storage medium |
CN116753843B (en) * | 2023-05-19 | 2024-04-12 | 北京建筑大学 | Engineering structure dynamic displacement monitoring method, device, equipment and storage medium |
CN117953036A (en) * | 2024-03-22 | 2024-04-30 | 河南敦喏建筑工程有限公司 | Road and bridge foundation settlement displacement monitoring system and method |
CN117953036B (en) * | 2024-03-22 | 2024-06-11 | 河南敦喏建筑工程有限公司 | Road and bridge foundation settlement displacement monitoring system and method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN115752250A (en) | Bridge high-precision displacement monitoring method fusing computer vision and acceleration | |
Zhang et al. | Three-dimensional shape measurements of specular objects using phase-measuring deflectometry | |
CN113624122B (en) | Bridge deformation monitoring method fusing GNSS data and InSAR technology | |
CN108759709B (en) | White light interference three-dimensional reconstruction method suitable for surface morphology detection | |
CN107621636B (en) | PSI-based large-scale railway bridge health monitoring method | |
CN103454636B (en) | Differential interferometric phase estimation method based on multi-pixel covariance matrixes | |
CN112284332B (en) | High-rise building settlement monitoring result three-dimensional positioning method based on high-resolution INSAR | |
Bai et al. | UAV based accurate displacement monitoring through automatic filtering out its camera's translations and rotations | |
CN114187330A (en) | Structural micro-amplitude vibration working mode analysis method based on optical flow method | |
Xiao et al. | Large-scale structured light 3D shape measurement with reverse photography | |
CN116363121A (en) | Computer vision-based inhaul cable force detection method, system and device | |
CN114812491B (en) | Transmission line earth surface deformation early warning method and device based on long-time sequence analysis | |
CN102062572B (en) | Joint transform correlator (JTC)-based high-accuracy photoelectric hybrid image motion measurement device and method | |
Cai et al. | Estimating small structural motions from multi-view video measurement | |
CN116183226A (en) | Bearing test bed vibration displacement measurement and modal analysis algorithm based on phase | |
CN201680816U (en) | High-precision photo-electricity mixing image motion measuring device based on JTC | |
CN104077772A (en) | MEMS in-plane micro-motion measurement method based on blurred image correlation and fractal wavelet interpolation | |
CN114280608B (en) | Method and system for removing DInSAR elevation-related atmospheric effect | |
CN113483879B (en) | Small satellite flutter high-speed video measurement method | |
Liu et al. | Modeling, measurement, and calibration of three-axis integrated aerial camera pointing errors | |
CN115494500A (en) | Goaf rapid detection method and system based on remote sensing interferometry and application | |
CN115326025A (en) | Binocular image measuring and predicting method for sea waves | |
Zhang et al. | A 3D measurement method with accurate boundaries based on mutation feature detection of laser stripes | |
Li et al. | A structured light vision sensor for online measurement of steel-plate width | |
Chen et al. | Long distance video camera measurements of structures |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||