CN112906095B - Bridge modal identification method and system based on laser stripe center tracking - Google Patents

Bridge modal identification method and system based on laser stripe center tracking

Info

Publication number
CN112906095B
CN112906095B
Authority
CN
China
Prior art keywords
bridge
laser
coordinate
point
laser stripe
Prior art date
Legal status
Active
Application number
CN202011512922.2A
Other languages
Chinese (zh)
Other versions
CN112906095A (en)
Inventor
吴桐
唐亮
罗东
周志祥
陈虹侨
Current Assignee
Yunnan Jiaotou Pulan Expressway Co ltd
Chongqing Jiaotong University
Original Assignee
Chongqing Jiaotong University
Priority date
Filing date
Publication date
Application filed by Chongqing Jiaotong University
Priority to CN202011512922.2A
Publication of CN112906095A
Application granted
Publication of CN112906095B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 Computer-aided design [CAD]
    • G06F 30/10 Geometric CAD
    • G06F 30/13 Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Abstract

The invention provides a bridge modal identification method based on laser stripe center tracking, which comprises the following steps: projecting a laser line onto the structural surface of a bridge and collecting image information of the laser stripe formed by the laser line on the bridge surface; processing the image of the laser stripe and extracting the position of the center point of the laser stripe; fitting the variation of the horizontal or vertical coordinate of the laser stripe center point over time to form a laser line displacement time-course curve; determining, from the laser line displacement time-course curve, the time domain response of continuous points on the bridge surface within the length range of the laser line projected onto it; and identifying the bridge modal parameters from the time domain response of these continuous points. With this method, accurate bridge modal parameters can be obtained without arranging complex monitoring equipment on the bridge, which effectively saves labor and equipment costs; the method does not depend on illumination or background conditions and has high accuracy.

Description

Bridge modal identification method and system based on laser stripe center tracking
Technical Field
The invention relates to the field of bridges, and in particular to a bridge modal identification method and system based on laser stripe center tracking.
Background
The modal parameters of a structure mainly comprise its natural frequencies, mode shapes and damping ratios. Changes in the modal parameters indirectly reflect changes in the physical parameters of the structure (mass, stiffness, damping, etc.); they are the main parameters determining the dynamic characteristics of the structure and are the premise and basis for structural damage identification and condition assessment.
The traditional modal parameter identification approach arranges contact sensors on the surface of the structure, picks up the vibration response signals of the structure (displacement, velocity, acceleration, etc.) and performs modal analysis on these signals to obtain the modal parameters. This approach can measure only at a few discrete positions, usually requires wiring for data transmission or power supply, is difficult to implement on large bridges in the field, and is costly.
In recent years, digital image correlation (DIC) has been widely used in industry owing to its non-contact and large-scale measurement capability: by tracking the motion of external features of a structure in a sequence of digital images recorded by a camera, the full-field displacement and deformation of the structure can be measured without contact, and the structure can then be subjected to modal analysis. Target-tracking DIC usually requires manually spraying speckle patterns or sticking identification points on or near the surface of the structure to obtain a trackable external target, which may not be permitted on an in-service bridge. Edge-detection and motion-magnification algorithms can achieve targetless tracking, but they place certain requirements on illumination and background conditions; if the illumination changes during measurement, their performance deteriorates.
Therefore, in order to solve the above technical problems, a new technical means needs to be proposed.
Disclosure of Invention
In view of the above, the present invention provides a bridge modal identification method and system based on laser stripe center tracking, which identify and estimate bridge modal parameters from image information. Accurate bridge modal parameters can thus be obtained without arranging complex monitoring equipment on the bridge, which effectively saves labor and equipment costs; moreover, the method does not depend on illumination or background conditions and has high accuracy.
The invention provides a bridge modal identification method based on laser stripe center tracking, which comprises the following steps:
S1, projecting a laser line onto the structural surface of the bridge and collecting image information of the laser stripe formed by the laser line on the bridge surface; after the laser line emitted by the laser is projected onto the bridge surface, it appears to the human eye as a line, but in the image it appears as a laser stripe of a certain width;
S2, processing the image of the laser stripe and extracting the position of the center point of the laser stripe;
S3, fitting the variation of the abscissa or ordinate of the laser stripe center point over time to form a laser line displacement time-course curve;
S4, determining, from the laser line displacement time-course curve, the time domain response of continuous points on the bridge surface within the length range of the laser line projected onto it;
S5, identifying the modal parameters of the bridge according to the time domain response of the continuous points on the bridge surface.
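For orientation, the sketch below chains steps S1 to S5 in a few lines of Python. It is a minimal illustration only: it uses an intensity-weighted centroid instead of the Gaussian fit of step S2 and simple spectral peak-picking instead of a full modal analysis algorithm, and all function names are assumptions rather than part of the patent.

```python
import numpy as np

def stripe_center_rows(img):
    """Per-row stripe center as the intensity-weighted column coordinate; a simple stand-in
    for the Gaussian-fit extraction of step S2."""
    cols = np.arange(img.shape[1], dtype=float)
    w = img.astype(float)
    return (w * cols).sum(axis=1) / np.maximum(w.sum(axis=1), 1e-9)

def identify_dominant_frequencies(frames, fps):
    """frames: list of 2-D grayscale arrays of the laser stripe; fps: camera frame rate.
    Returns the dominant vibration frequency (Hz) of each point along the laser line."""
    centers = np.stack([stripe_center_rows(f) for f in frames])  # S2/S3: center position vs. time
    disp = centers - centers.mean(axis=0)                        # S3: displacement time course (pixels)
    spec = np.abs(np.fft.rfft(disp, axis=0))                     # S5: spectrum of each point's response
    freqs = np.fft.rfftfreq(disp.shape[0], d=1.0 / fps)
    return freqs[np.argmax(spec[1:], axis=0) + 1]                # skip the DC bin, pick the peak
```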
Further, step S2 specifically includes:
S21, carrying out gray-level processing on the image of the laser stripe and expressing the processed image by a gray-level matrix:

G = \begin{bmatrix} g(1,1) & g(1,2) & \cdots & g(1,N) \\ g(2,1) & g(2,2) & \cdots & g(2,N) \\ \vdots & \vdots & \ddots & \vdots \\ g(M,1) & g(M,2) & \cdots & g(M,N) \end{bmatrix}   (1)

wherein g(x, y) is the gray value of each pixel of the gray-level image, and M and N indicate that the image consists of M × N pixels;

S22, finding the maximum gray value within a column (or row) of the gray-level matrix; the point corresponding to this maximum gray value is the light-intensity peak point, and its coordinate is recorded;

S23, selecting a 1:2 window and, for a laser stripe of width s, fitting the column (or row) pixel points of the gray-level matrix to a quadratic curve by the least-squares method, giving the quadratic curve equation

f(y) = a y^2 + b y + c   (2)

if the pixel points of the laser stripe are arranged along a row of the gray-level matrix, the row pixel points are fitted instead;

S24, substituting the coordinates of the pixel points within the stripe width s into the quadratic curve equation and writing the result in matrix form F = YB, wherein

F = [f(y_1), f(y_2), \ldots, f(y_s)]^T,   Y = \begin{bmatrix} y_1^2 & y_1 & 1 \\ y_2^2 & y_2 & 1 \\ \vdots & \vdots & \vdots \\ y_s^2 & y_s & 1 \end{bmatrix},   B = [a, b, c]^T   (3)

so that the coefficient vector is obtained by least squares as B = (Y^T Y)^{-1} Y^T F   (4);

S25, determining the light-intensity distribution function of the laser stripe:

I(y) = A \exp\!\left( -\frac{(y - \bar{y})^2}{2\sigma^2} \right)   (5)

wherein A is the gray value (amplitude) of the laser stripe, y is the y coordinate of the rectangular coordinate system on the image plane, \bar{y} is the y-axis coordinate of the center of the light-intensity distribution in that coordinate system, and \sigma is the width of the laser-stripe intensity distribution;

taking the logarithm of the light-intensity distribution function gives

\ln I(y) = \ln A - \frac{(y - \bar{y})^2}{2\sigma^2}   (6)

letting F(y) = \ln I(y), a_0 = \ln A - \bar{y}^2/(2\sigma^2), a_1 = \bar{y}/\sigma^2 and a_2 = -1/(2\sigma^2), the logarithmic expression of the light-intensity distribution is rewritten as

F(y) = a_0 + a_1 y + a_2 y^2   (7)

establishing the objective function

\sum_{i=-N}^{N} \left[ F(y_i) - (a_0 + a_1 y_i + a_2 y_i^2) \right]^2 \rightarrow \min   (8)

wherein 2N + 1 is the number of data points of the intensity distribution; setting the partial derivatives of the objective function with respect to a_0, a_1 and a_2 to zero yields the matrix equation

\begin{bmatrix} 2N+1 & \sum y_i & \sum y_i^2 \\ \sum y_i & \sum y_i^2 & \sum y_i^3 \\ \sum y_i^2 & \sum y_i^3 & \sum y_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum F(y_i) \\ \sum y_i F(y_i) \\ \sum y_i^2 F(y_i) \end{bmatrix}   (9)

solving this system by the Householder transformation method gives a_0, a_1 and a_2, and the position of the center point of the laser stripe is then

\bar{y} = -\frac{a_1}{2 a_2}   (10).
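A minimal numerical sketch of steps S22 to S25 follows, assuming the stripe runs roughly horizontally so that each image column is processed independently; the window half-width and the small floor on the intensity are illustrative choices, not values taken from the patent.

```python
import numpy as np

def gaussian_stripe_center(column, half_width):
    """Sub-pixel stripe center along one image column: fit a parabola to the logarithm of the
    intensity around the peak and return -a1/(2*a2), cf. equations (5)-(10)."""
    g = column.astype(float)
    peak = int(np.argmax(g))                               # S22: light-intensity peak point
    lo, hi = max(peak - half_width, 0), min(peak + half_width + 1, len(g))
    y = np.arange(lo, hi, dtype=float)
    F = np.log(np.maximum(g[lo:hi], 1e-9))                 # F(y) = ln I(y)
    Y = np.column_stack([np.ones_like(y), y, y**2])        # design matrix for a0 + a1*y + a2*y^2
    Q, R = np.linalg.qr(Y)                                 # QR factorisation (Householder reflections)
    a0, a1, a2 = np.linalg.solve(R, Q.T @ F)
    return -a1 / (2.0 * a2)                                # eq. (10): center of the fitted Gaussian
```

For example, gaussian_stripe_center(gray_image[:, k], half_width=5) returns the sub-pixel row coordinate of the stripe center in column k.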
further, step S4 specifically includes:
determining a coordinate system, and determining the initial coordinate A of the laser stripe center point after the laser line is projected onto the bridge surface and the initial coordinate B of the corresponding point of the stripe center on the image plane; recording the coordinate position C of the camera installation point;
after the bridge surface deforms, determining the coordinate A' of the displaced laser stripe center point and the coordinate B' of the corresponding point on the image plane;
in the coordinate system, a line parallel to the segment AA' is drawn through point B; its intersection with the line joining the camera coordinate C and the displaced stripe center A' is denoted Bi, and the distance between B and Bi is given by equation (11), in which a is the distance |AA'| (the formula images of equations (11) to (16) are not reproduced here);
the straight line through B and B' is expressed by equation (12), in which α is the angle between line BB' and line BBi;
the straight line through C and A' is expressed by equation (13), in which b is the distance from the coordinate origin O to point B and c is the distance from the origin to the camera position coordinate C;
combining equations (12) and (13) determines the coordinates of B' (equation (14));
the length d of the segment BB' is then calculated from equation (15), and substituting equation (11) into equation (15) yields equation (16).
the time-course vibration curve of the laser stripe is obtained based on equation (16).
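The sketch below assembles the per-frame center positions into the displacement time-course curves used in steps S3 and S4; the conversion from image pixels to physical displacement, which the patent derives geometrically in equations (11) to (16), is abstracted here into a single scale factor and is therefore only an assumption.

```python
import numpy as np

def displacement_time_course(center_tracks, fps, pixel_to_mm=1.0):
    """center_tracks: array of shape (n_frames, n_points) holding the stripe-center coordinate
    of each point along the laser line in each frame. Returns the time vector and the
    displacement time histories; pixel_to_mm stands in for the geometric conversion of
    equations (11)-(16)."""
    tracks = np.asarray(center_tracks, dtype=float)
    t = np.arange(tracks.shape[0]) / fps
    disp = (tracks - tracks[0]) * pixel_to_mm   # displacement relative to the first (reference) frame
    return t, disp
```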
Correspondingly, the bridge modal identification system based on laser stripe center tracking comprises a laser source, a camera, an image processing module, a time-course curve fitting module, a bridge vibration time domain response extraction module and a bridge modal parameter identification module;
the laser source is used for emitting laser lines to the surface of the bridge and forming laser stripes on the surface of the bridge;
the camera is arranged outside the bridge, is positioned on the outer side below the bridge and is used for acquiring and outputting a laser stripe image on the surface of the bridge;
the image processing module is used for receiving the image information output by the camera, carrying out gray level processing and then extracting the central point position of the laser stripe in the image;
the time-course curve fitting module receives the central point position of the laser stripe output by the image processing module and fits to form a time-course curve of the central point in a vibration state;
the bridge vibration time domain response extraction module receives the time course curve output by the time course curve fitting module and extracts the vibration time domain response of the bridge;
and the bridge modal parameter identification module receives the vibration time domain response of the bridge output by the bridge vibration time domain response extraction module and identifies the bridge modal parameters.
The invention has the following beneficial effects: the identification and estimation of the bridge modal parameters are carried out based on image information, so that accurate bridge modal parameters can be obtained without arranging complex monitoring equipment on the bridge, which effectively saves labor and equipment costs; the method does not depend on illumination or background conditions and has high accuracy.
Drawings
The invention is further described below with reference to the following figures and examples:
FIG. 1 is a flow chart of the present invention.
Fig. 2 is a schematic structural diagram of the present invention.
FIG. 3 is a schematic diagram of the time curve fitting transformation of the present invention.
Fig. 4 is a schematic view of the arrangement structure of the present invention.
Detailed Description
The following detailed description is made in conjunction with the accompanying drawings:
the invention provides a bridge mode identification method based on laser stripe center tracking, which comprises the following steps:
s1, projecting a laser line to the surface of a structure of a bridge, and collecting image information of laser stripes formed by the laser line on the surface of the bridge;
s2, processing the image of the laser stripe, and extracting the position of the central point of the laser stripe;
s3, fitting the time-varying state of the abscissa or the ordinate of the central point of the laser stripe to form a laser line displacement time-course curve, wherein the displacement of the laser line reflects the vibration state of the bridge;
s4, determining time domain response of continuous points on the bridge surface within the length range of the laser line projected on the bridge surface according to the laser line displacement time-course curve;
s5, identifying bridge modal parameters according to the time domain response of the continuous points on the surface of the bridge; with this method, accurate bridge modal parameters can be obtained without arranging complex monitoring equipment on the bridge, which effectively saves labor and equipment costs, and the method does not depend on illumination or background conditions and has high accuracy.
In this embodiment, step S2 specifically includes:
S21, carrying out gray-level processing on the image of the laser stripe and expressing the processed image by a gray-level matrix:

G = \begin{bmatrix} g(1,1) & g(1,2) & \cdots & g(1,N) \\ g(2,1) & g(2,2) & \cdots & g(2,N) \\ \vdots & \vdots & \ddots & \vdots \\ g(M,1) & g(M,2) & \cdots & g(M,N) \end{bmatrix}   (1)

wherein g(x, y) is the gray value of each pixel of the gray-level image, and M and N indicate that the image consists of M × N pixels;

S22, finding the maximum gray value within a column (or row) of the gray-level matrix; the point corresponding to this maximum gray value is the light-intensity peak point, and its coordinate is recorded;

S23, selecting a 1:2 window and, for a laser stripe of width s, fitting the column (or row) pixel points of the gray-level matrix to a quadratic curve by the least-squares method, giving the quadratic curve equation

f(y) = a y^2 + b y + c   (2)

S24, substituting the coordinates of the pixel points within the stripe width s into the quadratic curve equation and writing the result in matrix form F = YB, wherein

F = [f(y_1), f(y_2), \ldots, f(y_s)]^T,   Y = \begin{bmatrix} y_1^2 & y_1 & 1 \\ y_2^2 & y_2 & 1 \\ \vdots & \vdots & \vdots \\ y_s^2 & y_s & 1 \end{bmatrix},   B = [a, b, c]^T   (3)

so that the coefficient vector is obtained by least squares as B = (Y^T Y)^{-1} Y^T F   (4);

S25, determining the light-intensity distribution function of the laser stripe:

I(y) = A \exp\!\left( -\frac{(y - \bar{y})^2}{2\sigma^2} \right)   (5)

wherein A is the gray value (amplitude) of the laser stripe, y is the y coordinate of the rectangular coordinate system on the image plane, \bar{y} is the y-axis coordinate of the center of the light-intensity distribution in that coordinate system, and \sigma is the width of the laser-stripe intensity distribution;

taking the logarithm of the light-intensity distribution function gives

\ln I(y) = \ln A - \frac{(y - \bar{y})^2}{2\sigma^2}   (6)

letting F(y) = \ln I(y), a_0 = \ln A - \bar{y}^2/(2\sigma^2), a_1 = \bar{y}/\sigma^2 and a_2 = -1/(2\sigma^2), the logarithmic expression of the light-intensity distribution is rewritten as

F(y) = a_0 + a_1 y + a_2 y^2   (7)

establishing the objective function

\sum_{i=-N}^{N} \left[ F(y_i) - (a_0 + a_1 y_i + a_2 y_i^2) \right]^2 \rightarrow \min   (8)

wherein 2N + 1 is the number of data points of the intensity distribution; setting the partial derivatives of the objective function with respect to a_0, a_1 and a_2 to zero yields the matrix equation

\begin{bmatrix} 2N+1 & \sum y_i & \sum y_i^2 \\ \sum y_i & \sum y_i^2 & \sum y_i^3 \\ \sum y_i^2 & \sum y_i^3 & \sum y_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum F(y_i) \\ \sum y_i F(y_i) \\ \sum y_i^2 F(y_i) \end{bmatrix}   (9)

solving this system by the Householder transformation method gives a_0, a_1 and a_2, and the position of the center point of the laser stripe is then

\bar{y} = -\frac{a_1}{2 a_2}   (10).
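To show how steps S21 and S22 look on a whole camera frame, here is a short sketch; the RGB-to-gray weights are the usual luminance coefficients and are an assumption, since the patent does not specify the gray-level conversion. The per-column peaks it returns would seed the sub-pixel fit of step S25.

```python
import numpy as np

def stripe_peaks(frame_rgb):
    """Steps S21-S22 on a whole frame: grayscale conversion followed by per-column
    intensity-peak detection. Returns (columns, peak_rows) for a frame of shape (M, N, 3)."""
    weights = np.array([0.299, 0.587, 0.114])            # standard luminance weights (an assumption)
    gray = frame_rgb[..., :3].astype(float) @ weights    # S21: gray-level image g(x, y)
    peak_rows = np.argmax(gray, axis=0)                  # S22: brightest pixel in each column
    cols = np.arange(gray.shape[1])
    return cols, peak_rows
```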
by the method, the position of the center point of the laser stripe can be accurately determined.
In this embodiment, step S4 specifically includes:
determining a coordinate system, and determining the initial coordinate A of the laser stripe center point after the laser line is projected onto the bridge surface and the initial coordinate B of the corresponding point of the stripe center on the image plane; recording the coordinate position C of the camera installation point;
after the bridge surface deforms, determining the coordinate A' of the displaced laser stripe center point and the coordinate B' of the corresponding point on the image plane;
in the coordinate system, a line parallel to the segment AA' is drawn through point B; its intersection with the line joining the camera coordinate C and the displaced stripe center A' is denoted Bi, and the distance between B and Bi is given by equation (11), in which a is the distance |AA'| (the formula images of equations (11) to (16) are not reproduced here);
the straight line through B and B' is expressed by equation (12), in which α is the angle between line BB' and line BBi;
the straight line through C and A' is expressed by equation (13), in which b is the distance from the coordinate origin O to point B and c is the distance from the origin to the camera position coordinate C; the above geometrical relationship is shown in FIG. 3;
combining equations (12) and (13) determines the coordinates of B' (equation (14));
the length d of the segment BB' is then calculated from equation (15), and substituting equation (11) into equation (15) yields equation (16).
The time-course vibration curve of the laser stripe is obtained based on equation (16). From this time-course curve, the time domain response of the laser stripe center points is obtained with an existing algorithm, and the time domain response of the whole bridge surface is then obtained from the responses of these center points; modal parameter identification is subsequently carried out with an existing algorithm, for example as follows.
The time domain response is represented as a linear combination of modal contributions:

u(x, t) = \sum_{j} \varphi_j(x) q_j(t)

where j is the mode order, \varphi_j is the j-th mode shape vector of the structure, \varphi_j(x) is the displacement mode shape at coordinate x, and q_j(t) is the modal coordinate at time t. At coordinate x, assuming that the vibration of the bridge is caused entirely by an excitation f_s applied at a point on the beam, the relation between the response and the excitation can be expressed as

U(x, \omega) = H(x, \omega) F_s(\omega)

where U(x, \omega) and F_s(\omega) are the Fourier transforms of u(x, t) and f_s(t) respectively, \omega denotes frequency, and H is the frequency response function; an experimental modal analysis algorithm is applied to H to obtain the modal parameters of the bridge.
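As an illustration of this last step, here is a minimal peak-picking sketch that estimates natural frequencies from |H(ω)| = |U(ω)/Fs(ω)|. A real experimental modal analysis (for example, curve fitting of H for damping ratios and mode shapes) is considerably more involved, so this is only a sketch under the assumption that the excitation signal is available.

```python
import numpy as np

def frf_natural_frequencies(response, excitation, fps, n_peaks=3):
    """Estimate natural frequencies from the magnitude of the frequency response function
    H(w) = U(w) / Fs(w) by simple peak-picking; a minimal stand-in for the experimental
    modal analysis algorithm mentioned in the text."""
    U = np.fft.rfft(np.asarray(response, dtype=float))     # Fourier transform of the response u(x, t)
    Fs = np.fft.rfft(np.asarray(excitation, dtype=float))  # Fourier transform of the excitation fs(t)
    H = np.abs(U) / np.maximum(np.abs(Fs), 1e-12)          # |H(w)|
    freqs = np.fft.rfftfreq(len(response), d=1.0 / fps)
    interior = (H[1:-1] > H[:-2]) & (H[1:-1] > H[2:])      # local maxima, ignoring the end bins
    idx = np.where(interior)[0] + 1
    idx = idx[np.argsort(H[idx])[::-1][:n_peaks]]          # keep the n_peaks largest peaks
    return np.sort(freqs[idx])
```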
Correspondingly, the bridge modal identification system based on laser stripe center tracking comprises a laser source, a camera, an image processing module, a time-course curve fitting module, a bridge vibration time domain response extraction module and a bridge modal parameter identification module;
the laser source is used for emitting laser lines to the surface of the bridge and forming laser stripes on the surface of the bridge;
the camera is arranged outside the bridge, is positioned on the outer side below the bridge and is used for acquiring and outputting a laser stripe image on the surface of the bridge;
the image processing module is used for receiving the image information output by the camera, carrying out gray level processing and then extracting the central point position of the laser stripe in the image;
the time-course curve fitting module receives the central point position of the laser stripe output by the image processing module and fits to form a time-course curve of the central point in a vibration state;
the bridge vibration time domain response extraction module receives the time course curve output by the time course curve fitting module and extracts the vibration time domain response of the bridge;
and the bridge modal parameter identification module receives the vibration time domain response of the bridge output by the bridge vibration time domain response extraction module and identifies the bridge modal parameters.
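For completeness, a minimal sketch of how the modules of the identification system could be composed in software; the class, attribute and method names are assumptions made for illustration and do not appear in the patent.

```python
from dataclasses import dataclass
from typing import Callable, Sequence
import numpy as np

@dataclass
class BridgeModalSystem:
    """Wiring of the modules described above; the names are illustrative only."""
    extract_center: Callable[[np.ndarray], np.ndarray]             # image processing module
    fit_time_course: Callable[[np.ndarray, float], np.ndarray]     # time-course curve fitting module
    extract_response: Callable[[np.ndarray], np.ndarray]           # vibration time domain response module
    identify_modes: Callable[[np.ndarray, float], np.ndarray]      # modal parameter identification module

    def run(self, frames: Sequence[np.ndarray], fps: float) -> np.ndarray:
        centers = np.stack([self.extract_center(f) for f in frames])  # laser-stripe centers per frame
        curve = self.fit_time_course(centers, fps)                    # displacement time-course curve
        response = self.extract_response(curve)                       # vibration time domain response
        return self.identify_modes(response, fps)                     # bridge modal parameters
```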
Finally, the above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the preferred embodiments, those skilled in the art will understand that modifications or equivalent substitutions may be made to these technical solutions without departing from their spirit and scope, and all such modifications and substitutions shall be covered by the claims of the present invention.

Claims (2)

1. A bridge modal identification method based on laser stripe center tracking, characterized in that the method comprises the following steps:
s1, projecting a laser line to the surface of a structure of a bridge, and collecting image information of laser stripes formed by the laser line on the surface of the bridge;
s2, processing the image of the laser stripe, and extracting the position of the central point of the laser stripe;
s3, fitting the time-varying state of the abscissa or the ordinate of the central point of the laser stripe to form a laser line displacement time-course curve;
s4, determining time domain response of continuous points on the bridge surface within the length range of the laser line projected on the bridge surface according to the laser line displacement time-course curve;
s5, identifying bridge modal parameters according to the time domain response of the continuous points on the surface of the bridge;
step S2 specifically includes:
S21, carrying out gray-level processing on the image of the laser stripe and expressing the processed image by a gray-level matrix:

G = \begin{bmatrix} g(1,1) & g(1,2) & \cdots & g(1,N) \\ g(2,1) & g(2,2) & \cdots & g(2,N) \\ \vdots & \vdots & \ddots & \vdots \\ g(M,1) & g(M,2) & \cdots & g(M,N) \end{bmatrix}   (1)

wherein g(x, y) is the gray value of each pixel of the gray-level image, and M and N indicate that the image consists of M × N pixels;

S22, finding the maximum gray value within a column (or row) of the gray-level matrix; the point corresponding to this maximum gray value is the light-intensity peak point, and its coordinate is recorded;

S23, selecting a 1:2 window and, for a laser stripe of width s, fitting the column (or row) pixel points of the gray-level matrix to a quadratic curve by the least-squares method, giving the quadratic curve equation

f(y) = a y^2 + b y + c   (2)

S24, substituting the coordinates of the pixel points within the stripe width s into the quadratic curve equation and writing the result in matrix form F = YB, wherein

F = [f(y_1), f(y_2), \ldots, f(y_s)]^T,   Y = \begin{bmatrix} y_1^2 & y_1 & 1 \\ y_2^2 & y_2 & 1 \\ \vdots & \vdots & \vdots \\ y_s^2 & y_s & 1 \end{bmatrix},   B = [a, b, c]^T   (3)

so that the coefficient vector is obtained by least squares as B = (Y^T Y)^{-1} Y^T F   (4);

S25, determining the light-intensity distribution function of the laser stripe:

I(y) = A \exp\!\left( -\frac{(y - \bar{y})^2}{2\sigma^2} \right)   (5)

wherein A is the gray value (amplitude) of the laser stripe, y is the y coordinate of the rectangular coordinate system on the image plane, \bar{y} is the y-axis coordinate of the center of the light-intensity distribution in that coordinate system, and \sigma is the width of the laser-stripe intensity distribution;

taking the logarithm of the light-intensity distribution function gives

\ln I(y) = \ln A - \frac{(y - \bar{y})^2}{2\sigma^2}   (6)

letting F(y) = \ln I(y), a_0 = \ln A - \bar{y}^2/(2\sigma^2), a_1 = \bar{y}/\sigma^2 and a_2 = -1/(2\sigma^2), the logarithmic expression of the light-intensity distribution is rewritten as

F(y) = a_0 + a_1 y + a_2 y^2   (7)

establishing the objective function

\sum_{i=-N}^{N} \left[ F(y_i) - (a_0 + a_1 y_i + a_2 y_i^2) \right]^2 \rightarrow \min   (8)

wherein 2N + 1 is the number of data points of the intensity distribution; setting the partial derivatives of the objective function with respect to a_0, a_1 and a_2 to zero yields the matrix equation

\begin{bmatrix} 2N+1 & \sum y_i & \sum y_i^2 \\ \sum y_i & \sum y_i^2 & \sum y_i^3 \\ \sum y_i^2 & \sum y_i^3 & \sum y_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum F(y_i) \\ \sum y_i F(y_i) \\ \sum y_i^2 F(y_i) \end{bmatrix}   (9)

solving this system by the Householder transformation method gives a_0, a_1 and a_2, and the position of the center point of the laser stripe is then

\bar{y} = -\frac{a_1}{2 a_2}   (10);
step S4 specifically includes:
determining a coordinate system, and determining the initial coordinate A of the laser stripe center point after the laser line is projected onto the bridge surface and the initial coordinate B of the corresponding point of the stripe center on the image plane; recording the coordinate position C of the camera installation point;
after the bridge surface deforms, determining the coordinate A' of the displaced laser stripe center point and the coordinate B' of the corresponding point on the image plane;
in the coordinate system, a line parallel to the segment AA' is drawn through point B; its intersection with the line joining the camera coordinate C and the displaced stripe center A' is denoted Bi, and the distance between B and Bi is given by equation (11), in which a is the distance |AA'| (the formula images of equations (11) to (16) are not reproduced here);
the straight line through B and B' is expressed by equation (12), in which α is the angle between line BB' and line BBi;
the straight line through C and A' is expressed by equation (13), in which b is the distance from the coordinate origin O to point B and c is the distance from the origin to the camera position coordinate C;
combining equations (12) and (13) determines the coordinates of B' (equation (14));
the length d of the segment BB' is then calculated from equation (15), and substituting equation (11) into equation (15) yields equation (16);
the time-course vibration curve of the laser stripe is obtained based on equation (16).
2. A bridge modal identification system based on laser stripe center tracking, characterized in that the system comprises a laser source, a camera, an image processing module, a time-course curve fitting module, a bridge vibration time domain response extraction module and a bridge modal parameter identification module;
the laser source is used for emitting laser lines to the surface of the bridge and forming laser stripes on the surface of the bridge;
the camera is arranged outside the bridge, is positioned on the outer side below the bridge and is used for acquiring and outputting a laser stripe image on the surface of the bridge;
the image processing module is used for receiving the image information output by the camera, carrying out gray level processing and then extracting the central point position of the laser stripe in the image;
the time-course curve fitting module receives the central point position of the laser stripe output by the image processing module and fits to form a time-course curve of the central point in a vibration state;
the bridge vibration time domain response extraction module receives the time course curve output by the time course curve fitting module and extracts the vibration time domain response of the bridge;
the bridge modal parameter identification module receives the vibration time domain response of the bridge output by the bridge vibration time domain response extraction module and identifies the bridge modal parameters;
the identification system identifies parameters based on the following method:
carrying out gray-level processing on the image of the laser stripe and expressing the processed image by a gray-level matrix:

G = \begin{bmatrix} g(1,1) & g(1,2) & \cdots & g(1,N) \\ g(2,1) & g(2,2) & \cdots & g(2,N) \\ \vdots & \vdots & \ddots & \vdots \\ g(M,1) & g(M,2) & \cdots & g(M,N) \end{bmatrix}   (1)

wherein g(x, y) is the gray value of each pixel of the gray-level image, and M and N indicate that the image consists of M × N pixels;

finding the maximum gray value within a column (or row) of the gray-level matrix; the point corresponding to this maximum gray value is the light-intensity peak point, and its coordinate is recorded;

selecting a 1:2 window and, for a laser stripe of width s, fitting the column (or row) pixel points of the gray-level matrix to a quadratic curve by the least-squares method, giving the quadratic curve equation

f(y) = a y^2 + b y + c   (2)

substituting the coordinates of the pixel points within the stripe width s into the quadratic curve equation and writing the result in matrix form F = YB, wherein

F = [f(y_1), f(y_2), \ldots, f(y_s)]^T,   Y = \begin{bmatrix} y_1^2 & y_1 & 1 \\ y_2^2 & y_2 & 1 \\ \vdots & \vdots & \vdots \\ y_s^2 & y_s & 1 \end{bmatrix},   B = [a, b, c]^T   (3)

so that the coefficient vector is obtained by least squares as B = (Y^T Y)^{-1} Y^T F   (4);

determining the light-intensity distribution function of the laser stripe:

I(y) = A \exp\!\left( -\frac{(y - \bar{y})^2}{2\sigma^2} \right)   (5)

wherein A is the gray value (amplitude) of the laser stripe, y is the y coordinate of the rectangular coordinate system on the image plane, \bar{y} is the y-axis coordinate of the center of the light-intensity distribution in that coordinate system, and \sigma is the width of the laser-stripe intensity distribution;

taking the logarithm of the light-intensity distribution function gives

\ln I(y) = \ln A - \frac{(y - \bar{y})^2}{2\sigma^2}   (6)

letting F(y) = \ln I(y), a_0 = \ln A - \bar{y}^2/(2\sigma^2), a_1 = \bar{y}/\sigma^2 and a_2 = -1/(2\sigma^2), the logarithmic expression of the light-intensity distribution is rewritten as

F(y) = a_0 + a_1 y + a_2 y^2   (7)

establishing the objective function

\sum_{i=-N}^{N} \left[ F(y_i) - (a_0 + a_1 y_i + a_2 y_i^2) \right]^2 \rightarrow \min   (8)

wherein 2N + 1 is the number of data points of the intensity distribution; setting the partial derivatives of the objective function with respect to a_0, a_1 and a_2 to zero yields the matrix equation

\begin{bmatrix} 2N+1 & \sum y_i & \sum y_i^2 \\ \sum y_i & \sum y_i^2 & \sum y_i^3 \\ \sum y_i^2 & \sum y_i^3 & \sum y_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum F(y_i) \\ \sum y_i F(y_i) \\ \sum y_i^2 F(y_i) \end{bmatrix}   (9)

solving this system by the Householder transformation method gives a_0, a_1 and a_2, and the position of the center point of the laser stripe is then

\bar{y} = -\frac{a_1}{2 a_2}   (10);
determining the time domain response of continuous points on the bridge surface in the length range of the laser line projected on the bridge surface according to the laser line displacement time-course curve; the method specifically comprises the following steps:
determining a coordinate system, and determining the initial coordinate A of the laser stripe center point after the laser line is projected onto the bridge surface and the initial coordinate B of the corresponding point of the stripe center on the image plane; recording the coordinate position C of the camera installation point;
after the bridge surface deforms, determining the coordinate A' of the displaced laser stripe center point and the coordinate B' of the corresponding point on the image plane;
in the coordinate system, a line parallel to the segment AA' is drawn through point B; its intersection with the line joining the camera coordinate C and the displaced stripe center A' is denoted Bi, and the distance between B and Bi is given by equation (11), in which a is the distance |AA'| (the formula images of equations (11) to (16) are not reproduced here);
the straight line through B and B' is expressed by equation (12), in which α is the angle between line BB' and line BBi;
the straight line through C and A' is expressed by equation (13), in which b is the distance from the coordinate origin O to point B and c is the distance from the origin to the camera position coordinate C;
combining equations (12) and (13) determines the coordinates of B' (equation (14));
the length d of the segment BB' is then calculated from equation (15), and substituting equation (11) into equation (15) yields equation (16);
the time-course vibration curve of the laser stripe is obtained based on equation (16).
CN202011512922.2A 2020-12-20 2020-12-20 Bridge modal identification method and system based on laser stripe center tracking Active CN112906095B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011512922.2A CN112906095B (en) 2020-12-20 2020-12-20 Bridge modal identification method and system based on laser stripe center tracking

Publications (2)

Publication Number Publication Date
CN112906095A CN112906095A (en) 2021-06-04
CN112906095B true CN112906095B (en) 2022-04-08

Family

ID=76111655

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011512922.2A Active CN112906095B (en) 2020-12-20 2020-12-20 Bridge modal identification method and system based on laser stripe center tracking

Country Status (1)

Country Link
CN (1) CN112906095B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113340405B (en) * 2021-07-09 2023-03-17 中铁七局集团有限公司 Bridge vibration mode measuring method, device and system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8577628B2 (en) * 2009-04-10 2013-11-05 University Of South Carolina System and method for modal identification using smart mobile sensors
WO2015195728A1 (en) * 2014-06-17 2015-12-23 Drexel University Self-contained rapid modal testing system for highway bridges
CN109682561B (en) * 2019-02-19 2020-06-16 大连理工大学 Method for automatically detecting free vibration response of high-speed railway bridge to identify mode
CN110285770B (en) * 2019-07-31 2020-08-07 中山大学 Bridge deflection change measuring method, device and equipment
CN111272366B (en) * 2020-03-02 2021-12-07 东南大学 Bridge displacement high-precision measurement method based on multi-sensor data fusion

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201740635U (en) * 2010-06-13 2011-02-09 重庆交通大学 Multi-spot deflection measurement device based on line laser
CN104657587A (en) * 2015-01-08 2015-05-27 华中科技大学 Method for extracting center line of laser stripe
CN108286948A (en) * 2017-01-09 2018-07-17 南京理工大学 A kind of deflection of bridge span detection method based on image procossing
CN107588913A (en) * 2017-08-03 2018-01-16 长安大学 A kind of deflection of bridge span detecting system and detection method
CN108921864A (en) * 2018-06-22 2018-11-30 广东工业大学 A kind of Light stripes center extraction method and device
CN110147781A (en) * 2019-05-29 2019-08-20 重庆交通大学 Bridge vibration mode based on machine learning visualizes damnification recognition method
CN110487197A (en) * 2019-09-03 2019-11-22 厦门大学嘉庚学院 A kind of bridge dynamic degree of disturbing real-time monitoring system applied based on ccd image sensor and laser

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Application of an improved Gaussian fitting method in light-stripe center extraction; Sun Panqing et al.; Electronic Design Engineering; 2012-07-05; pp. 179-181, 185 *
A new bridge deflection measurement system based on laser and video; Lan Zhangli et al.; Chinese Journal of Scientific Instrument; 2009-11-15; pp. 2405-2410 *
Assessment of the damage degree of old bridges based on laser detection; Xu Jian et al.; Laser Journal; 2017-05-25; pp. 50-53 *

Also Published As

Publication number Publication date
CN112906095A (en) 2021-06-04

Similar Documents

Publication Publication Date Title
Chen et al. High-accuracy multi-camera reconstruction enhanced by adaptive point cloud correction algorithm
CN103471910B (en) A kind of elongation at break of metal material intelligent test method followed the tracks of based on random point
Fukuda et al. Vision-based displacement sensor for monitoring dynamic response using robust object search algorithm
CN109341903B (en) Inhaul cable force measuring method based on edge recognition in computer vision
Du et al. Dynamic measurement of stay-cable force using digital image techniques
CN111272366B (en) Bridge displacement high-precision measurement method based on multi-sensor data fusion
Jurjo et al. Experimental methodology for the dynamic analysis of slender structures based on digital image processing techniques
CN111174961B (en) Cable force optical measurement method based on modal analysis and measurement system thereof
CN103575227A (en) Vision extensometer implementation method based on digital speckles
EP2580557A1 (en) System and method for determining the position and orientation of a 3d feature
CN107817044B (en) Device and method for measuring plate vibration based on machine vision
CN102788572A (en) Method, device and system for measuring attitude of lifting hook of engineering machinery
CN113554697A (en) Cabin section profile accurate measurement method based on line laser
CN112906095B (en) Bridge modal identification method and system based on laser stripe center tracking
CN115790387A (en) Bridge displacement corner synchronous real-time monitoring method and system based on online camera
CN114066985B (en) Method for calculating hidden danger distance of power transmission line and terminal
CN111156921A (en) Contour data processing method based on sliding window mean filtering
CN110969601B (en) Structure rotation response non-contact identification method based on visual characteristic tracking algorithm
Su et al. Feature-constrained real-time simultaneous monitoring of monocular vision odometry for bridge bearing displacement and rotation
CN115542338B (en) Laser radar data learning method based on point cloud spatial distribution mapping
CN115761487A (en) Method for quickly identifying vibration characteristics of small and medium-span bridges based on machine vision
CN114219768A (en) Method, device, equipment and medium for measuring inhaul cable force based on pixel sensor
CN110532725B (en) Engineering structure mechanical parameter identification method and system based on digital image
CN115690150A (en) Video-based multi-target displacement tracking monitoring method and device
CN117029711B (en) Full-bridge strain response reconstruction method based on machine vision and optical fiber sensing technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230609

Address after: No. 100 Chayuan Road, Simao District, Pu'er City, Yunnan Province, 665099

Patentee after: Yunnan Jiaotou Pulan Expressway Co.,Ltd.

Patentee after: CHONGQING JIAOTONG University

Address before: 402247 No. 1 Fuxing Road, Shuang Fu New District, Jiangjin District, Chongqing.

Patentee before: CHONGQING JIAOTONG University

TR01 Transfer of patent right