CN112525326A - Computer vision measurement method for three-dimensional vibration of unmarked structure - Google Patents


Info

Publication number
CN112525326A
CN112525326A (application CN202011316519.2A)
Authority
CN
China
Prior art keywords
virtual feature
pixel
feature point
image
virtual
Prior art date
Legal status
Pending
Application number
CN202011316519.2A
Other languages
Chinese (zh)
Inventor
徐自力 (Xu Zili)
辛存 (Xin Cun)
王存俊 (Wang Cunjun)
李康迪 (Li Kangdi)
Current Assignee
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN202011316519.2A
Publication of CN112525326A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01H: MEASUREMENT OF MECHANICAL VIBRATIONS OR ULTRASONIC, SONIC OR INFRASONIC WAVES
    • G01H9/00: Measuring mechanical vibrations or ultrasonic, sonic or infrasonic waves by using radiation-sensitive means, e.g. optical means
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/002: Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)

Abstract

A computer vision method for unmarked measurement of three-dimensional structural vibration comprises the following steps: 1) recording the vibration of the structure from different viewing angles with a binocular camera; 2) performing a multi-scale difference-space characterization of the initial frame image and screening out the structural virtual feature points of the initial frame at each viewing angle; 3) selecting a pixel region centered on each virtual feature point and constructing descriptor vectors of the virtual feature points at the different viewing angles of the initial frame from characteristics such as image pixel brightness and gradient; 4) computing Euclidean distances between the descriptor vectors to obtain the pixel coordinates of the same virtual feature point at different viewing angles; 5) on the basis of the obtained pixel coordinates, calculating frame by frame the pixel coordinates of the same virtual feature point at the same viewing angle; 6) constructing the mapping between the camera coordinate system and the space coordinate system and acquiring the three-dimensional vibration information of the structure. The measuring device is simple to install, requires no markers on the structure, and is suitable for three-dimensional vibration measurement of structures in a variety of scenes.

Description

Computer vision measurement method for three-dimensional vibration of unmarked structure
Technical Field
The invention belongs to the technical field of structural vibration measurement, and particularly relates to a computer vision measurement method for three-dimensional vibration of an unmarked structure.
Background
Three-dimensional vibration information reflects the real motion of a structure and is an important means of describing the details of that motion, so developing an efficient and fast three-dimensional vibration measurement method is of great significance. Existing computer-vision-based three-dimensional vibration measurement methods, such as three-dimensional digital image correlation and three-dimensional point tracking, are non-contact and highly accurate and have been widely applied. However, when measuring the three-dimensional vibration of a structure, the vision methods currently adopted rely on artificial markers, such as speckle patterns and geometric targets, and these markers easily fall off during measurement, which is inconvenient. Moreover, for large structures and for structures in service in harsh environments, marking is expensive: a single marking operation can require substantial manpower and material resources, which poses a great challenge to practical engineering application.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide a computer vision measurement method for three-dimensional vibration of an unmarked structure, so as to solve the problem that existing three-dimensional vibration measurement techniques for structures depend on artificial markers and are difficult to apply in practical engineering.
In order to achieve the purpose, the invention adopts the technical scheme that:
a computer vision measurement method for three-dimensional vibration of a mark-free structure comprises the following steps:
step 1), carrying out video recording on vibration of a structure under different visual angles by using a binocular camera;
step 2), carrying out multi-scale difference space representation on the initial frame image, and respectively screening out structural virtual characteristic points of the initial frame under different viewing angles by comparing brightness information of pixel points on different scales;
step 3), selecting a pixel area with the virtual feature point as a center, and respectively constructing descriptor vectors of the virtual feature point under different visual angles of an initial frame based on the brightness and the gradient of the image pixel;
step 4), Euclidean distance calculation is carried out among the descriptor vectors of the virtual characteristic points under different visual angles, and pixel coordinates of the same virtual characteristic point of an initial frame under different visual angles are obtained;
step 5), on the basis of the pixel coordinates of the same virtual feature point of the initial frame at the different viewing angles, tracking the pixel coordinates of the same virtual feature point frame by frame at each viewing angle with an optical flow algorithm;
step 6), constructing the mapping relation between the camera coordinate system and the space coordinate system from parameters of the binocular camera such as parallax, baseline and focal length, and acquiring the three-dimensional space coordinates of the virtual feature points at different moments from the pixel coordinates of the virtual feature points at different moments and different viewing angles;
and 7), measuring the three-dimensional vibration of the structure based on the three-dimensional space coordinates of the virtual feature points at different moments.
Compared with the prior art, the invention has the beneficial effects that:
1) The surface of the structure to be measured requires no pretreatment, so the application range is wider.
2) The method is robust to image scale change, rotation, illumination variation and similar disturbances, and the measurement precision is high.
Drawings
Fig. 1 is a schematic flow chart of a three-dimensional vibration unmarked measurement method of a structure according to the present invention.
Fig. 2 is a schematic diagram of an image multi-scale difference space.
Fig. 3 is a schematic diagram of a detection principle based on image difference spatial feature points.
FIG. 4 is a schematic diagram of a feature point descriptor vector based on pixel intensity gradients.
FIG. 5 is a schematic flow chart of computing the feature point position information frame by frame with the optical flow algorithm.
Fig. 6 is a schematic view of the geometric relationship of the binocular high-speed camera to measure the three-dimensional vibration of the structure.
Detailed Description
The embodiments of the present invention will be described in detail below with reference to the drawings and examples.
The invention provides a computer vision measurement method for three-dimensional vibration of an unmarked structure. It performs unmarked measurement of the three-dimensional vibration of a structure through the steps of screening virtual feature points on the structure surface, constructing descriptor vectors for the virtual feature points, matching the virtual feature points across viewing angles, and synthesizing the spatial three-dimensional coordinates of the virtual feature points. The principle of the method is shown in figure 1, and the specific steps are as follows:
step 1: and carrying out video recording on the vibration of the structure under different visual angles by using a binocular camera.
Step 2: as shown in fig. 2, in order to detect stable virtual feature points in the video image, the invention convolves Gaussian kernel functions of different scales with the initial frame image to obtain images at different scales, that is:

L(x, y, σ) = G(x, y, σ) * I(x, y)    (1)

in the formula: L(x, y, σ) is the image after Gaussian kernel convolution, G(x, y, σ) is the Gaussian kernel function of scale σ, * denotes the convolution operation, and I(x, y) represents the image pixel brightness information.
The Gaussian difference space of the image is constructed from the images at different scales, described as:

D(x, y, σ) = L(x, y, (l+1)σ) - L(x, y, lσ),  l = 1, 2, 3, …, M+1    (2)

in the formula: D(x, y, σ) represents the difference image at scale σ, and l is the layer index of the established Gaussian difference space image.
As shown in fig. 3, the invention obtains the virtual feature points of the structure by pixel extremum detection in the image multi-scale difference space: the brightness of each pixel is compared with that of its 26 neighbouring pixels in the scale space, and a pixel whose brightness is greater than (or less than) that of all of its neighbours is screened out as a virtual feature point on the structure surface. The 26 neighbours comprise 8 pixels in the pixel's own scale layer and 9 pixels in each of the scale layers above and below (18 in total).
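The scale-space construction and the 26-neighbour extremum test above can be sketched in Python. This is a minimal sketch, not the patent's implementation: SciPy's `gaussian_filter` stands in for the Gaussian convolution, and the scale step and layer count are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_space(img, sigma=1.6, layers=5):
    """Difference-of-Gaussian space: blur the initial frame at scales l*sigma
    and subtract adjacent levels, D = L(.,(l+1)sigma) - L(.,l*sigma)."""
    img = img.astype(np.float32)
    L = [gaussian_filter(img, sigma * l) for l in range(1, layers + 2)]
    return [L[l + 1] - L[l] for l in range(layers)]

def is_extremum(dog, l, y, x):
    """True if pixel (y, x) of layer l is strictly larger or strictly smaller
    than all 26 scale-space neighbours (8 in its layer, 9 above, 9 below)."""
    cube = np.stack([d[y - 1:y + 2, x - 1:x + 2] for d in dog[l - 1:l + 2]])
    val = dog[l][y, x]
    others = np.delete(cube.ravel(), 13)  # drop the centre pixel itself
    return bool(np.all(val > others) or np.all(val < others))
```

A pixel passes the test only if it is a strict maximum or minimum over all 26 neighbours, which is what keeps the detected virtual feature points stable.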
Step 3: as shown in fig. 4, a 16 × 16 pixel region centered on the virtual feature point is selected, the brightness-gradient magnitude and direction of each pixel in the region are calculated, the gradient directions of the pixels in each 4 × 4 patch are counted into 8 directions (0°, 45°, 90°, 135°, 180°, 225°, 270°, 315°), and the value of each direction is accumulated with the pixel brightness-gradient magnitudes in the region. The pixel brightness-gradient magnitude g(x, y) and direction θ(x, y) are expressed as:

g(x, y) = sqrt(Gx(x, y)^2 + Gy(x, y)^2)    (3)

θ(x, y) = arctan(Gy(x, y)/Gx(x, y))    (4)

in the formula: Gx(x, y) and Gy(x, y) are the brightness gradients at position (x, y) in the x and y directions respectively, calculated as follows:

Gx = Hx * I(x, y),  Gy = Hy * I(x, y)    (5)

in the formula: I(x, y) is the image pixel brightness, * represents the convolution operation, and Hx and Hy are the gradient operators in the x and y directions, e.g. the difference kernels

Hx = [-1 0 1],  Hy = [-1 0 1]^T    (6)

A seed point is generated for each 4 × 4 patch; each virtual feature point is thus described by 4 × 4 = 16 seed points, each carrying vector information in the 8 directions, which yields a 16 × 8 = 128-dimensional descriptor vector for the virtual feature point.
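A minimal sketch of this 128-dimensional descriptor, assuming NumPy; the final normalisation step is a common convention and is not stated in the text above.

```python
import numpy as np

def descriptor(img, cy, cx):
    """128-dim descriptor: a 16x16 patch around (cy, cx) split into 4x4 cells,
    each cell an 8-bin histogram of gradient direction weighted by magnitude."""
    patch = img[cy - 8:cy + 8, cx - 8:cx + 8].astype(np.float32)
    gx = np.gradient(patch, axis=1)                 # brightness gradient in x
    gy = np.gradient(patch, axis=0)                 # brightness gradient in y
    mag = np.hypot(gx, gy)                          # gradient magnitude g(x, y)
    ang = np.degrees(np.arctan2(gy, gx)) % 360      # direction in [0, 360)
    bins = (ang // 45).astype(int)                  # 8 directions: 0, 45, ..., 315
    vec = np.zeros((4, 4, 8), np.float32)
    for i in range(16):
        for j in range(16):
            vec[i // 4, j // 4, bins[i, j]] += mag[i, j]
    v = vec.ravel()                                 # 4 x 4 x 8 = 128 components
    return v / (np.linalg.norm(v) + 1e-12)          # normalised (assumed convention)
```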
Step 4: let the two images of the initial frame at the two viewing angles be m and n, and let the sets of structural virtual feature point descriptor vectors created in step 3 be {Wm(i), i = 1, …, s} and {Wn(j), j = 1, …, s}, where s is the number of virtual feature points screened out in the m and n images.

In image m, a feature point i is selected, with pixel coordinates (xm(i), ym(i)) and descriptor vector Wm(i). The Euclidean distances between Wm(i) and each Wn(j) are computed, and the pixel coordinates of the virtual feature point whose descriptor vector gives the minimum distance are recorded as (xn(j), yn(j)). The virtual feature point (xm(i), ym(i)) in image m then corresponds to the virtual feature point (xn(j), yn(j)) in image n.
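The minimum-Euclidean-distance matching of step 4 can be sketched as follows, assuming the descriptors of each view are stored as rows of a NumPy array.

```python
import numpy as np

def match_features(desc_m, desc_n):
    """desc_m, desc_n: (s, 128) descriptor arrays for views m and n.
    Returns (i, j) pairs matching each row of m to its nearest row of n."""
    pairs = []
    for i, d in enumerate(desc_m):
        dist = np.linalg.norm(desc_n - d, axis=1)  # Euclidean distances to all of n
        j = int(np.argmin(dist))                   # minimum-distance descriptor
        pairs.append((i, j))
    return pairs
```

Each returned pair (i, j) links feature i of image m to its minimum-distance feature j of image n, giving the pixel coordinates of the same virtual feature point in both views.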
Step 5: for a single virtual feature point k, based on the optical flow assumptions, the optical flow equation is established as:

Ix(k) u + Iy(k) v = -It(k)    (7)

in the formula: Ix(k) and Iy(k) are the brightness gradients of the virtual feature point k in the x and y directions, It(k) is the derivative of the image brightness with respect to time, and u and v represent the optical flows of the virtual feature point k in the x and y directions.

Since the brightness of every image point at any moment is captured by the shooting equipment, the gradients of each virtual feature point in the x and y directions can be computed directly from the image pixel brightness and are known quantities; the equation, however, contains two unknowns u and v and cannot be solved on its own. The assumption of motion similarity in the pixel neighbourhood is therefore introduced: the virtual feature point k and its n neighbouring pixels undergo similar motion, satisfying:

Ix(1) u + Iy(1) v = -It(1)
Ix(2) u + Iy(2) v = -It(2)
...
Ix(n) u + Iy(n) v = -It(n)    (8)

For brevity the above is written Ak dk = bk, where the rows of Ak are the gradients [Ix(i), Iy(i)], dk = [u, v]^T, and bk stacks the -It(i). In the present invention the system is solved by least squares, i.e. dk is the minimizer of |Ak dk - bk|^2, so the optical flow of the virtual feature point k is:

dk = (Ak^T Ak)^(-1) Ak^T bk    (9)
From the optical flow of a virtual feature point computed between two adjacent frames, the pixel coordinates of the virtual feature point in successive frame images can be described as:

xk(t+Δt) = xk(t) + u·Δt,  yk(t+Δt) = yk(t) + v·Δt    (10)

in the formula: xk(t) and xk(t+Δt) are the horizontal pixel coordinates of the virtual feature point k at times t and t+Δt, and yk(t) and yk(t+Δt) are the vertical pixel coordinates of the virtual feature point k at times t and t+Δt.
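The least-squares optical flow solve of step 5 can be sketched in the Lucas-Kanade form; the 5 × 5 neighbourhood window and the use of `np.linalg.lstsq` are assumptions of this sketch.

```python
import numpy as np

def lk_flow(I0, I1, y, x, half=2):
    """Solve A_k d_k = b_k by least squares over a (2*half+1)^2 window around
    feature point (y, x); returns d_k = (u, v), the optical flow."""
    I0 = I0.astype(np.float32); I1 = I1.astype(np.float32)
    win = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
    Ix = np.gradient(I0, axis=1)[win].ravel()   # brightness gradient in x
    Iy = np.gradient(I0, axis=0)[win].ravel()   # brightness gradient in y
    It = (I1 - I0)[win].ravel()                 # temporal brightness derivative
    A = np.column_stack([Ix, Iy])
    b = -It
    d, *_ = np.linalg.lstsq(A, b, rcond=None)   # minimises |A d - b|^2
    return d  # the coordinate update is x + u*dt, y + v*dt
```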
Finally, the pixel coordinates of the feature points under the same view angle in different frames can be obtained through the step, and the calculation flow is shown in fig. 5.
Step 6: as shown in fig. 6, let the pixel coordinates of the structural virtual feature point k at time t in the two views be (xl(t), yl(t)) and (xr(t), yr(t)), and let the space coordinates of the virtual feature point k at time t be (Xk(t), Yk(t), Zk(t)). The two cameras are fixed in the same plane, i.e. yl(t) = yr(t). According to the triangular relation between the measured structure and the binocular camera, the following can be obtained:

xl(t)/f = Xk(t)/Zk(t),  xr(t)/f = (Xk(t) - B)/Zk(t),  yl(t)/f = Yk(t)/Zk(t)    (11)

in the formula: f represents the focal length of the cameras, B represents the baseline between the cameras, (xl(t), yl(t)) are the pixel coordinates of the virtual feature point in the left camera at time t, and (xr(t), yr(t)) are the pixel coordinates of the virtual feature point in the right camera at time t.

Therefore, the space coordinates of the virtual feature point k are expressed as:

Xk(t) = B·xl(t)/d,  Yk(t) = B·yl(t)/d,  Zk(t) = B·f/d    (12)

in the formula: d represents the parallax (disparity) between the cameras, equal to d = xl(t) - xr(t).
Finally, through the steps, the three-dimensional space coordinates of the structural virtual feature points in different frames can be obtained.
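The triangulation of step 6 reduces to a few lines for a rectified pair; pixel coordinates here are assumed to be measured relative to the principal point, and the numeric values in the usage below are made up for illustration.

```python
def triangulate(xl, yl, xr, f, B):
    """Space coordinates of a feature from rectified stereo: focal length f
    (pixels), baseline B (metres), left pixel (xl, yl), right pixel xr."""
    d = xl - xr        # parallax (disparity) between the two views
    Z = f * B / d      # depth from the similar-triangle relation
    X = B * xl / d
    Y = B * yl / d
    return X, Y, Z
```

For example, with f = 1000 px, B = 0.1 m, xl = 120, xr = 100, yl = 50, the disparity is 20 px and the depth Z is 5 m.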
Step 7: the spatial motion of the structure is computed from the space three-dimensional coordinates of the structural virtual feature points at different moments obtained in step 6, namely:

Dt = sqrt((Xt - X0)^2 + (Yt - Y0)^2 + (Zt - Z0)^2)    (13)

in the formula: Dt represents the displacement of the structure at time t.

Claims (7)

1. A computer vision measurement method for three-dimensional vibration of a mark-free structure is characterized by comprising the following steps:
step 1), carrying out video recording on vibration of a structure under different visual angles by using a binocular camera;
step 2), carrying out multi-scale difference space representation on the initial frame image, and respectively screening out structural virtual characteristic points of the initial frame under different viewing angles by comparing brightness information of pixel points on different scales;
step 3), selecting a pixel area with the virtual feature point as a center, and respectively constructing descriptor vectors of the virtual feature point under different visual angles of an initial frame based on the brightness and the gradient of the image pixel;
step 4), Euclidean distance calculation is carried out among the descriptor vectors of the virtual characteristic points under different visual angles, and pixel coordinates of the same virtual characteristic point of an initial frame under different visual angles are obtained;
step 5), on the basis of the pixel coordinates of the same virtual feature point of the initial frame at the different viewing angles, tracking the pixel coordinates of the same virtual feature point frame by frame at each viewing angle with an optical flow algorithm;
step 6), constructing the mapping relation between the camera coordinate system and the space coordinate system, and acquiring the three-dimensional space coordinates of the virtual feature points at different moments from the pixel coordinates of the virtual feature points at different moments and different viewing angles;
and 7), measuring the three-dimensional vibration of the structure based on the three-dimensional space coordinates of the virtual feature points at different moments.
2. The computer vision measurement method for three-dimensional vibration of an unmarked structure according to claim 1, wherein in the step 2), Gaussian kernel functions of different scales are convolved with the initial frame image to obtain images at different scales, that is:

L(x, y, σ) = G(x, y, σ) * I(x, y)

in the formula: L(x, y, σ) is the image after Gaussian kernel convolution, G(x, y, σ) is the Gaussian kernel function of scale σ, * denotes the convolution operation, and I(x, y) represents the image pixel brightness information;

the multi-scale difference space of the image is constructed using the images at different scales, described as:

D(x, y, σ) = L(x, y, (l+1)σ) - L(x, y, lσ)

in the formula: D(x, y, σ) represents the difference image at scale σ, and l = 1, 2, 3, … is the layer index of the Gaussian difference space image;

on the basis of the Gaussian multi-scale difference space, the brightness of each image pixel is compared with that of its 26 neighbouring pixels in the scale space to determine whether its brightness is greater or smaller than that of all of its neighbours in the scale domain, extremum points of the image pixel brightness are detected, and the virtual feature points of the structure surface are screened out, wherein the 26 pixels comprise 8 pixels in the pixel's own scale layer and 9 pixels in each of the scale layers above and below.
3. The computer vision measurement method for three-dimensional vibration of an unmarked structure according to claim 1, wherein in the step 3), a 16 × 16 pixel region centered on the virtual feature point is selected, the brightness-gradient magnitude and direction of each pixel in the region are calculated, the gradient directions of the pixels in each 4 × 4 patch are counted into 8 directions (0°, 45°, 90°, 135°, 180°, 225°, 270°, 315°), and the value of each direction is accumulated with the pixel brightness-gradient magnitudes in the region, wherein the pixel brightness-gradient magnitude g(x, y) and direction θ(x, y) are expressed as:

g(x, y) = sqrt(Gx(x, y)^2 + Gy(x, y)^2)

θ(x, y) = arctan(Gy(x, y)/Gx(x, y))

in the formula: Gx(x, y) and Gy(x, y) are the brightness gradients at position (x, y) in the x and y directions respectively, calculated as follows:

Gx = Hx * I(x, y),  Gy = Hy * I(x, y)

in the formula: I(x, y) is the image pixel brightness, * represents the convolution operation, and Hx and Hy are the gradient operators in the x and y directions, e.g. the difference kernels Hx = [-1 0 1] and Hy = [-1 0 1]^T;

a seed point is generated for each 4 × 4 patch; each virtual feature point is composed of 4 × 4 = 16 seed points, each with vector information in 8 directions, generating a 16 × 8 = 128-dimensional descriptor vector for the virtual feature point;
and repeating the step 3) to obtain the descriptor vectors of the structural virtual feature points under different visual angles of the initial frame.
4. The computer vision measurement method for three-dimensional vibration of an unmarked structure according to claim 3, wherein in the step 4), the two images of the initial frame at the two viewing angles are m and n respectively, and the sets of descriptor vectors of the structural virtual feature points created in step 3) are {Wm(i), i = 1, …, s} and {Wn(j), j = 1, …, s}, where s is the number of virtual feature points screened out in the m and n images;

in the image m, a feature point i is selected, with pixel coordinates (xm(i), ym(i)) and descriptor vector Wm(i); the Euclidean distances between Wm(i) and each Wn(j) are computed, and the pixel coordinates of the virtual feature point whose descriptor vector gives the minimum distance are recorded as (xn(j), yn(j)); at this time, the virtual feature point (xm(i), ym(i)) in the image m corresponds to the virtual feature point (xn(j), yn(j)) in the image n.
5. The computer vision measurement method for three-dimensional vibration of an unmarked structure according to claim 4, wherein in the step 5), for a single virtual feature point k, an optical flow equation is established according to the optical flow assumptions as follows:

Ix(k) u + Iy(k) v = -It(k)

in the formula: Ix(k) and Iy(k) are the brightness gradients of the virtual feature point k in the x and y directions, It(k) is the derivative of the image brightness with respect to time, and u and v represent the optical flows of the virtual feature point k in the x and y directions respectively;

the virtual feature point k and its n neighbouring pixels have similar motion, satisfying:

Ix(i) u + Iy(i) v = -It(i),  i = 1, 2, …, n

the above system, written compactly as Ak dk = bk, is solved by the least square method to obtain the optical flow of the virtual feature point k; from the optical flow of the virtual feature point computed between two adjacent frames, the pixel coordinates of the virtual feature point k in successive frame images are described as:

xk(t+Δt) = xk(t) + u·Δt,  yk(t+Δt) = yk(t) + v·Δt

in the formula: xk(t) and xk(t+Δt) represent the horizontal pixel coordinates of the virtual feature point k at times t and t+Δt respectively, and yk(t) and yk(t+Δt) represent the vertical pixel coordinates of the virtual feature point k at times t and t+Δt respectively;

finally, the pixel coordinates of the virtual feature points in different frames at the same viewing angle are obtained through this step.
6. The computer vision measurement method for three-dimensional vibration of an unmarked structure according to claim 5, wherein in the step 6), the pixel coordinates of the virtual feature point k on the structure at time t in the two views are (xl(t), yl(t)) and (xr(t), yr(t)), and the space coordinates of the virtual feature point k at time t are (Xk(t), Yk(t), Zk(t)); the two cameras are fixed in the same plane, i.e. yl(t) = yr(t);

according to the triangular relation between the measured structure and the binocular camera, the following can be obtained:

xl(t)/f = Xk(t)/Zk(t),  xr(t)/f = (Xk(t) - B)/Zk(t),  yl(t)/f = Yk(t)/Zk(t)

in the formula: f represents the focal length of the cameras, B represents the baseline between the cameras, (xl(t), yl(t)) represent the pixel coordinates of the virtual feature point in the left camera at time t, and (xr(t), yr(t)) represent the pixel coordinates of the virtual feature point in the right camera at time t;

the space coordinates of the virtual feature point k are then expressed as:

Xk(t) = B·xl(t)/d,  Yk(t) = B·yl(t)/d,  Zk(t) = B·f/d

in the formula: d represents the parallax (disparity) between the cameras, equal to d = xl(t) - xr(t);

finally, the three-dimensional space coordinates of the structural virtual feature points in different frames are obtained through this step.
7. The computer vision measurement method for three-dimensional vibration of an unmarked structure according to claim 6, wherein in the step 7), the spatial motion of the structure is computed from the space three-dimensional coordinates of the structural virtual feature points at different moments calculated in the step 6), that is:

Dt = sqrt((Xt - X0)^2 + (Yt - Y0)^2 + (Zt - Z0)^2)

in the formula: Dt represents the displacement of the structure at time t.
CN202011316519.2A 2020-11-21 2020-11-21 Computer vision measurement method for three-dimensional vibration of unmarked structure Pending CN112525326A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011316519.2A CN112525326A (en) 2020-11-21 2020-11-21 Computer vision measurement method for three-dimensional vibration of unmarked structure


Publications (1)

Publication Number Publication Date
CN112525326A true CN112525326A (en) 2021-03-19

Family

ID=74982198

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011316519.2A Pending CN112525326A (en) 2020-11-21 2020-11-21 Computer vision measurement method for three-dimensional vibration of unmarked structure

Country Status (1)

Country Link
CN (1) CN112525326A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113963013A (en) * 2021-10-22 2022-01-21 石家庄铁道大学 Markless power transmission tower displacement vibration identification method based on computer vision
CN113781522B (en) * 2021-08-25 2023-10-24 西安交通大学 Method for measuring gun barrel vibration under shooting working condition based on computer vision

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102176243A (en) * 2010-12-30 2011-09-07 浙江理工大学 Target ranging method based on visible light and infrared camera
CN102853902A (en) * 2012-09-06 2013-01-02 西安交通大学 Method for noncontact measurement of boundary vibration and application of method
CN105488816A (en) * 2015-11-27 2016-04-13 中南大学 On-line detection device and method of mineral flotation froth flow velocity on the basis of three-dimensional visual information
CN109883533A (en) * 2019-01-21 2019-06-14 哈尔滨工业大学(深圳) Low frequency vibration measurement method based on machine vision
CN110047110A (en) * 2019-03-11 2019-07-23 北京空间飞行器总体设计部 A kind of in-orbit vibration measurement method of flexible satellite antenna based on sequence image
CN110111390A (en) * 2019-05-15 2019-08-09 湖南科技大学 Thin-wall part omnidirectional vibration measurement method and system based on binocular vision optical flow tracking
US20190273845A1 (en) * 2018-03-05 2019-09-05 Prüftechnik Dieter Busch AG Vibration monitoring of an object using a video camera
CN210603573U (en) * 2019-04-24 2020-05-22 华南理工大学 Flexible plate rotary motion vibration detection device
CN111210463A (en) * 2020-01-15 2020-05-29 上海交通大学 Virtual wide-view visual odometer method and system based on feature point auxiliary matching


Non-Patent Citations (1)

Title
Liu Fengbo (刘峰伯): "Research on Distance and Speed Measurement with Binocular Stereo Vision Based on OpenCV", China Master's Theses Full-text Database, Information Science and Technology *


Similar Documents

Publication Publication Date Title
CN110473260B (en) Wave video measuring device and method
CN109559348A (en) A kind of contactless deformation measurement method of bridge based on tracing characteristic points
CN110956661B (en) Method for calculating dynamic pose of visible light and infrared camera based on bidirectional homography matrix
CN112254656B (en) Stereoscopic vision three-dimensional displacement measurement method based on structural surface point characteristics
CN110146030A (en) Side slope surface DEFORMATION MONITORING SYSTEM and method based on gridiron pattern notation
EP3857874B1 (en) Hybrid depth processing
CN112525326A (en) Computer vision measurement method for three-dimensional vibration of unmarked structure
CN112967312B (en) Real-time robust displacement monitoring method and system for field rigid body target
CN109341668A (en) Polyphaser measurement method based on refraction projection model and beam ray tracing method
CN108362205A (en) Space ranging method based on fringe projection
CN112595236A (en) Measuring device for underwater laser three-dimensional scanning and real-time distance measurement
CN113223050A (en) Robot motion track real-time acquisition method based on Aruco code
CN110120012B (en) Video stitching method for synchronous key frame extraction based on binocular camera
CN115717867A (en) Bridge deformation measurement method based on airborne double cameras and target tracking
CN105957060B (en) A kind of TVS event cluster-dividing method based on optical flow analysis
Le et al. System to measure three-dimensional movements in physical models
CN112712566B (en) Binocular stereo vision sensor measuring method based on structure parameter online correction
CN112488022B (en) Method, device and system for monitoring panoramic view
CN113763444A (en) Phase-unfolding-free three-dimensional face reconstruction method and system for level line pairing
CN114119670A (en) Flow velocity measuring method for acquiring river video based on smart phone
Bandara et al. Frame feature tracking for speed estimation
CN117934636B (en) Dynamic external parameter calibration method and device for multi-depth camera
CN114529493A (en) Cable appearance defect detection and positioning method based on binocular vision
Peng et al. A Maglev gap measurement method based on machine vision
CN106324976B (en) Test macro and test method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination