CN107945166B - Binocular vision-based method for measuring three-dimensional vibration track of object to be measured - Google Patents

Binocular vision-based method for measuring three-dimensional vibration track of object to be measured

Info

Publication number
CN107945166B
CN107945166B (application CN201711188795.3A)
Authority
CN
China
Prior art keywords
frame
measured
pixel
coordinate
cameras
Prior art date
Legal status
Active
Application number
CN201711188795.3A
Other languages
Chinese (zh)
Other versions
CN107945166A (en
Inventor
胡琮亮 (Hu Congliang)
方明杰 (Fang Mingjie)
饶文培 (Rao Wenpei)
Current Assignee
719th Research Institute of CSIC
Original Assignee
719th Research Institute of CSIC
Priority date
Filing date
Publication date
Application filed by 719th Research Institute of CSIC filed Critical 719th Research Institute of CSIC
Priority to CN201711188795.3A priority Critical patent/CN107945166B/en
Publication of CN107945166A publication Critical patent/CN107945166A/en
Application granted granted Critical
Publication of CN107945166B publication Critical patent/CN107945166B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0004: Industrial image inspection
    • G06T 7/001: Industrial image inspection using an image reference approach
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01H: MEASUREMENT OF MECHANICAL VIBRATIONS OR ULTRASONIC, SONIC OR INFRASONIC WAVES
    • G01H 9/00: Measuring mechanical vibrations or ultrasonic, sonic or infrasonic waves by using radiation-sensitive means, e.g. optical means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30108: Industrial image inspection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30232: Surveillance

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a binocular vision-based method for measuring the three-dimensional vibration track of an object to be measured, comprising the following steps: shooting the object to be measured with two cameras to obtain video stream data; selecting one frame of data captured at the same time as the first frame from the video stream data of the two cameras, and obtaining the pixel coordinates of the object to be measured in the first frame; calculating the offset vector of the object to be measured between the second frame and the first frame through pixel similarity, obtaining the matched pixel coordinates of the object to be measured in the second frame, and calculating the world coordinates of the object to be measured in the second frame by combining the pixel coordinates from the two cameras with the pixel-to-world coordinate conversion; calculating the estimated pixel coordinates of the object to be measured in the third frame, and the corresponding world coordinates, using the matched pixel coordinates in the second frame and the pixel similarity between the second and third frames; and repeating these steps until the three-dimensional vibration track of the object to be measured over the measurement period is obtained. The method is simple and has high measurement precision.

Description

Binocular vision-based method for measuring three-dimensional vibration track of object to be measured
Technical Field
The invention relates to the field of video monitoring, in particular to a binocular vision-based method for measuring a three-dimensional vibration track of an object to be measured.
Background
In order to reduce the noise of rotating equipment, its operating condition must be monitored to obtain vibration information, so that an active noise reduction method can be applied during operation. The traditional approach to monitoring rotating equipment is strain-gauge measurement, which captures only one or two dimensions and cannot measure three-dimensional motion. Three-dimensional vibration information of rotating equipment would give workers a large amount of operating data for noise reduction, yet obtaining such three-dimensional vibration information remains difficult at present.
Disclosure of Invention
Aiming at the defects in the prior art, the invention aims to provide a binocular vision-based method for measuring the three-dimensional vibration track of an object to be measured, which can obtain the three-dimensional vibration track of the object to be measured, is simple and practical and has high measurement precision.
In order to achieve the above purposes, the technical scheme adopted by the invention is as follows: the binocular vision-based method for measuring the three-dimensional vibration track of the object to be measured uses two cameras at different positions, the lens surfaces of the two cameras lying in the same plane and their central axes being parallel; the method comprises the following steps:
shooting an object to be detected by using two cameras, and obtaining video stream data of the two cameras;
selecting one frame of data at the same time as a first frame from the video stream data of the two cameras, and obtaining the pixel coordinates of the object to be detected in the first frame;
calculating an offset vector of the object to be measured between the second frame and the first frame through the pixel similarity, obtaining an estimated pixel coordinate of the object to be measured in the second frame, determining the estimated pixel coordinate as a matched pixel coordinate of the object to be measured in the second frame if the estimated pixel coordinate meets the precision requirement, and calculating a world coordinate of the object to be measured in the second frame by combining the pixel coordinates and the world coordinates of the two cameras;
calculating the estimated pixel coordinate of the object to be measured in the third frame and the corresponding world coordinate of the object to be measured in the third frame by using the matched pixel coordinate of the object to be measured in the second frame and the pixel similarity of the second frame and the third frame;
and repeating the steps until the three-dimensional vibration track of the object to be measured in the measuring time period is obtained.
Further, the method for calculating the estimated pixel coordinates of the object to be measured in the second frame comprises the following steps:
A1: Select the width and height of the two camera images as W and H respectively; the maximum width and height of the object to be measured are w and h respectively. The object to be measured in the first frame I0 has pixel coordinate (i0, j0) and pixel distribution A0:
[Equation image omitted: A0, the w×h array of pixel values occupied by the object to be measured in I0]
A2: Search for the object to be measured in the second frame I. Select the width and height of the search box as w and h respectively, set the pixel coordinate of the search box as (i, j), and set the pixel distribution as A:
[Equation image omitted: A, the w×h array of pixel values inside the search box at (i, j)]
A3: Calculate the offset vector:
[Equation image omitted: the offset vector M(xn, yn), a pixel-similarity-weighted mean of the displacements of the neighbourhood pixels relative to the search box, i.e. M(xn, yn) = Σ ω(i, j)·((i, j) − (xn, yn)) / Σ ω(i, j) in the usual mean-shift form]
wherein ω(i, j) is the pixel similarity, and
[Equation image omitted: the definition of ω(i, j), the similarity between the search-box pixel distribution A and the reference distribution A0]
n is the number of times the search box has been shifted while searching the second frame, and (xn, yn) is the pixel coordinate of the search box after the nth shift; (i, j) ranges over the pixel coordinates of the pixel points nearest to (xn, yn) (the count is given in an equation image, omitted), and (in, jn) ranges over the same neighbourhood of (xn, yn).
A4: Let
[Equation image omitted: the definition of m(xn, yn)]
then the estimated pixel coordinate of the object to be measured in the second frame is (x'n, y'n) = m(xn, yn) = (xn, yn) + M(xn, yn).
Further, the method for judging whether the estimated pixel coordinate is the matching pixel coordinate of the object to be measured in the second frame includes:
if M (x)nYn) | | less than or equal to epsilon or N > NmaxEstimating pixel coordinates as matched pixel coordinates of the object to be measured in the second frame;
if M (x)nYn) | > epsilon and N is less than or equal to NmaxThen return to step a 3;
wherein epsilon is a preset threshold value of the tracking step length of the object to be detected in the second frame, NmaxThe maximum number of offsets when searching in the second frame for the preset search box.
Further, before measurement, the internal parameter matrixes of the two cameras and the external parameter matrixes of the two cameras are respectively obtained.
Further, the internal parameter matrix and the external parameter matrix are obtained by the Zhang Zhengyou calibration method.
Further, the conversion formula of the pixel coordinate and the world coordinate is as follows:
[Equation image omitted; in the standard pinhole form: ZC·[u, v, 1]^T = K·[R | t]·[XW, YW, ZW, 1]^T]
wherein K [equation image omitted] is the internal parameter matrix and [R | t] [equation image omitted] is the external parameter matrix; (u, v) is the coordinate of the object to be measured in the pixel coordinate system; (u0, v0) is the center of the image plane; (x, y) is the coordinate of the object to be measured in the image plane coordinate system; (XW, YW, ZW) is the coordinate of the object to be measured in the world coordinate system.
Compared with the prior art, the invention has the advantages that:
(1) The method provided by the invention is simple and has high measurement precision. Based on the pixel characteristics of the object to be measured in the image, the pixel similarity between the image and the object to be measured is calculated in each frame, the offset vector is computed from the pixel similarity, and an offset vector satisfying the convergence condition is obtained by iteration, thereby determining the matched pixel coordinate of the object to be measured in each frame. The three-dimensional coordinates are then determined from the matched pixel coordinates of the object in the two cameras.
(2) The measuring method of the invention is non-contact, and the cameras can be arranged in a relatively closed space, so the method is suitable for vibrating objects working in severe environments. Durability is high: the cameras do not touch the vibrating object and operate in a relatively benign environment, so they can be used for a long time.
(3) Compared with traditional strain-gauge measurement, the method can acquire three-dimensional vibration information of the object to be measured, track the object, record the complete vibration track, and support extensive vibration analysis.
Drawings
Fig. 1 is a flowchart of a measurement method according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
It should be noted that "selected" and "preset" data are specified directly, while "set" data are determined by calculation.
Referring to fig. 1, an embodiment of the present invention provides a binocular vision-based method for measuring the three-dimensional vibration track of an object to be measured, using two cameras at different positions, the lens surfaces of the two cameras lying in the same plane and their central axes being parallel; the method comprises the following steps:
shooting an object to be detected by using two cameras, and obtaining video stream data of the two cameras;
selecting one frame of data captured at the same time as the first frame from the video stream data of the two cameras, and obtaining the pixel coordinates of the object to be measured in the first frame; the pixel coordinates in the first frame are selected, and therefore determined directly, as the basis for calculating the pixel coordinates of the object to be measured in the second frame.
Calculating an offset vector of the object to be measured between the second frame and the first frame through the pixel similarity, obtaining an estimated pixel coordinate of the object to be measured in the second frame, determining the estimated pixel coordinate as a matched pixel coordinate of the object to be measured in the second frame if the estimated pixel coordinate meets the precision requirement, and calculating a world coordinate of the object to be measured in the second frame by combining the pixel coordinates and the world coordinates of the two cameras;
calculating the estimated pixel coordinate of the object to be measured in the third frame and the corresponding world coordinate of the object to be measured in the third frame by using the matched pixel coordinate of the object to be measured in the second frame and the pixel similarity of the second frame and the third frame; that is, whether the estimated pixel coordinate of the object to be measured in the third frame meets the precision requirement is judged, and if so, the estimated pixel coordinate is determined to be the matched pixel coordinate of the object to be measured in the third frame.
Repeat the steps until the three-dimensional vibration track of the object to be measured in the measurement time period is obtained. That is, the pixel coordinate of the object in the second frame is calculated from its pixel coordinate in the first frame, the pixel coordinate in the third frame is calculated from the computed pixel coordinate in the second frame, the pixel coordinate in the fourth frame from that in the third frame, and so on, until the pixel coordinate of the object to be measured in every frame of the measurement period has been obtained. The world coordinate of the object in each frame is then obtained through the conversion relation between camera pixel coordinates and world coordinates.
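The frame-by-frame chaining described above can be sketched as follows. This is a minimal illustration rather than the patented implementation: `track` stands in for the search of steps S2 to S5 below, `triangulate` for the binocular world-coordinate solution, and all names are hypothetical.

```python
def measure_trajectory(frames_cam1, frames_cam2, p1_init, p2_init,
                       track, triangulate):
    """Chain matches frame to frame, as described above.

    frames_cam1, frames_cam2 -- synchronized frame sequences of the two cameras
    p1_init, p2_init         -- pixel coordinates of the object in the first frames
    track(prev, cur, p)      -- returns the matched pixel coordinate in `cur`,
                                seeded by the coordinate `p` matched in `prev`
    triangulate(p1, p2)      -- returns the world coordinate for a matched pair
    """
    p1, p2 = p1_init, p2_init
    trajectory = []
    for k in range(1, len(frames_cam1)):
        # the match in frame k-1 seeds the search in frame k, for each camera
        p1 = track(frames_cam1[k - 1], frames_cam1[k], p1)
        p2 = track(frames_cam2[k - 1], frames_cam2[k], p2)
        trajectory.append(triangulate(p1, p2))
    return trajectory
```

The resulting list of world coordinates, one per frame after the first, is the three-dimensional vibration track over the measurement period.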
The principle of the invention is as follows: the matched pixel coordinate of the object to be measured in the current frame is calculated from its matched pixel coordinate in the previous frame; the object is shot by two cameras simultaneously, so the matched pixel coordinates from both cameras at the same instant can be combined, through the pixel-to-world coordinate conversion, to calculate the world coordinate of the object at that moment.
During measurement the two cameras run simultaneously, and the measurement steps and calculation method are the same for both; the first and second frames of one camera are taken as an example. The pixel coordinate of the object to be measured in the first frame is selected, so the parameters of the first frame are known, and measurement starts from the second frame. For any two consecutive frames, the pixel coordinate of the object in the later frame can be determined from its pixel coordinate in the earlier frame: the second frame is calculated from the first, the third from the second, the fourth from the third, and so on, yielding the three-dimensional vibration track of the object over the measurement period. As shown in fig. 1, the specific steps for calculating the second frame from the first are as follows:
s1: acquiring an internal parameter matrix of a camera and an external parameter matrix of the camera; the method for acquiring the internal parameter matrix and the external parameter matrix is a common method, and preferably adopts a Zhangyingyou calibration method.
S2: Select the width and height of the camera image as W and H respectively; the maximum width and height of the object to be measured are w and h respectively. The object to be measured in the first frame I0 has pixel coordinate (i0, j0) and pixel distribution A0:
[Equation image omitted: A0, the w×h array of pixel values occupied by the object to be measured in I0]
Because the first frame is selected, the maximum width and height of the object to be measured and its pixel coordinate and pixel distribution in the first frame are known. For convenience of description, the images of the two cameras may also be chosen to be the same size, with width W and height H.
S3: Search for the object to be measured in the second frame I. Select the width and height of the search box as w and h respectively, set the pixel coordinate of the search box as (i, j), and set the pixel distribution as A:
[Equation image omitted: A, the w×h array of pixel values inside the search box at (i, j)]
S4: Calculate the offset vector:
[Equation image omitted: the offset vector M(xn, yn), a pixel-similarity-weighted mean of the displacements of the neighbourhood pixels relative to the search box, i.e. M(xn, yn) = Σ ω(i, j)·((i, j) − (xn, yn)) / Σ ω(i, j) in the usual mean-shift form]
wherein ω(i, j) is the pixel similarity, and
[Equation image omitted: the definition of ω(i, j), the similarity between the search-box pixel distribution A and the reference distribution A0]
n is the number of times the search box has been shifted while searching the second frame, and (xn, yn) is the pixel coordinate of the search box after the nth shift; (i, j) ranges over the pixel coordinates of the pixel points nearest to (xn, yn) (the count is given in an equation image, omitted), and (in, jn) ranges over the same neighbourhood of (xn, yn).
Let
[Equation image omitted: the definition of m(xn, yn)]
then the estimated pixel coordinate of the object to be measured in the second frame is (x'n, y'n) = m(xn, yn) = (xn, yn) + M(xn, yn);
S5: Judge: if ‖M(xn, yn)‖ ≤ ε or n > Nmax, the estimated pixel coordinate is taken as the matched pixel coordinate of the object to be measured in the second frame;
if ‖M(xn, yn)‖ > ε and n ≤ Nmax, return to step S4;
wherein ε is a preset threshold on the tracking step length of the object to be measured in the second frame, and Nmax is the preset maximum number of shifts of the search box when searching the second frame.
And repeating the steps to calculate the coordinate of the matched pixel of the object to be measured in each frame of the two cameras.
The threshold of the tracking step is related to the pixel resolution of the image and the size of the object to be measured in the image. For example, with an image resolution of 10000 × 10000 and an object occupying 1000 × 1000 pixels, the threshold may be set to 5: a movement of the target position by more than 5 pixels is judged as motion, while a movement below 5 pixels is considered small enough to ignore and treated as no motion.
Nmax is set so that, if the target cannot be found, the program does not loop indefinitely in the search.
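Steps S3 to S5 can be sketched as an iterative, similarity-weighted search. The patent defines the similarity ω(i, j) and the offset vector only in equation images, so the Gaussian intensity-difference weight below is an assumption, as are all names; this is a sketch of the mean-shift-style iteration under those assumptions, not the exact patented formula.

```python
import numpy as np

def mean_shift_track(template, frame, start, eps=0.5, n_max=50, sigma=25.0):
    """Iterate the similarity-weighted offset vector (cf. S4) until its
    norm is <= eps or n_max shifts have been used (cf. S5).

    template -- h x w patch A0 of the object from the previous frame
    frame    -- current grayscale frame I (2-D array)
    start    -- (row, col) of the search box, taken from the previous match
    """
    h, w = template.shape
    r, c = start
    t = template.astype(float)
    for n in range(n_max):
        patch = frame[r:r + h, c:c + w].astype(float)
        if patch.shape != (h, w):          # search box left the image
            break
        # omega(i, j): per-pixel similarity (assumed Gaussian in intensity)
        weight = np.exp(-(patch - t) ** 2 / (2.0 * sigma ** 2))
        rows, cols = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
        # M(x_n, y_n): similarity-weighted mean displacement w.r.t. the centre
        dr = np.sum(weight * (rows - (h - 1) / 2.0)) / np.sum(weight)
        dc = np.sum(weight * (cols - (w - 1) / 2.0)) / np.sum(weight)
        if np.hypot(dr, dc) <= eps:        # step small enough: matched
            break
        r, c = int(round(r + dr)), int(round(c + dc))
    # if n_max is exhausted, the last position is accepted, as in S5
    return r, c
```

On a synthetic 40 × 40 frame containing a 5 × 5 bright square, a search started two pixels off converges onto the square within a few shifts.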
According to the matched pixel coordinates of the object to be measured in the two cameras, combining the pixel coordinates and a world coordinate conversion formula, carrying out simultaneous solution on the equation set to obtain the world coordinates of the object to be measured in each frame; the conversion formula of the pixel coordinate and the world coordinate is as follows:
[Equation image omitted; in the standard pinhole form: ZC·[u, v, 1]^T = K·[R | t]·[XW, YW, ZW, 1]^T]
wherein K [equation image omitted] is the internal parameter matrix and [R | t] [equation image omitted] is the external parameter matrix; (u, v) is the coordinate of the object to be measured in the pixel coordinate system, i.e. the pixel coordinate; (u0, v0) is the center of the image plane; (x, y) is the coordinate of the object in the image plane coordinate system; (XC, YC, ZC) is the coordinate of the object in the camera coordinate system; (XW, YW, ZW) is the coordinate of the object in the world coordinate system.
For example, the matched pixel coordinate of the object to be measured in the first camera is (u1, v1), and in the second camera it is (u2, v2).
For the first camera:
[Equation image omitted: the projection equation above written with (u1, v1) and the first camera's parameter matrices]
For the second camera:
[Equation image omitted: the projection equation above written with (u2, v2) and the second camera's parameter matrices]
Solving the two equation sets simultaneously yields (XW, YW, ZW).
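The simultaneous solution just described can be sketched as a linear (DLT) triangulation. The camera matrices and pixel values below are invented for illustration, and the least-squares route via SVD is one common choice, not necessarily the patent's:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Solve the two projection equation sets jointly for (X_W, Y_W, Z_W).

    P1, P2   -- 3x4 projection matrices K[R | t] of the two cameras
    uv1, uv2 -- matched pixel coordinates (u, v) in each camera
    Each camera contributes two homogeneous linear equations; the
    least-squares solution is the smallest right singular vector of A.
    """
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                 # dehomogenise to (X_W, Y_W, Z_W)

# Hypothetical stereo pair: shared intrinsics, second camera 0.1 m along X.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
Xw = triangulate(P1, P2, (400.0, 280.0), (360.0, 280.0))  # ~ (0.2, 0.1, 2.0)
```

With exact, noise-free matches the null vector of A gives the world point exactly; with noisy matches the SVD solution is the linear least-squares estimate.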
The pixel coordinates of the object to be measured in the two cameras are calculated separately at each instant. When judging whether the estimated pixel coordinates are matched pixel coordinates, the world coordinate is computed and the next frame is processed only when the estimated pixel coordinates in both cameras are judged to be matched pixel coordinates. Otherwise, if the estimated pixel coordinate in one camera is judged to be a matched pixel coordinate (the condition is satisfied) while the estimated pixel coordinate in the other camera is not, the calculation continues for the camera that has not yet satisfied the condition; during this process the search box of the already matched camera may remain unchanged or change only slightly, until the estimated pixel coordinate in the other camera is also judged to be a matched pixel coordinate, after which the current world coordinate is calculated and the next frame is processed.
The present invention is not limited to the above-described embodiments, and it will be apparent to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and such modifications and improvements are also considered to be within the scope of the present invention. Those not described in detail in this specification are within the skill of the art.

Claims (5)

1. A binocular vision-based method for measuring the three-dimensional vibration track of an object to be measured, using two cameras at different positions, the lens surfaces of the two cameras lying in the same plane and their central axes being parallel, the method being characterized by comprising the following steps:
shooting an object to be detected by using two cameras, and obtaining video stream data of the two cameras;
selecting one frame of data at the same time as a first frame from the video stream data of the two cameras, and obtaining the pixel coordinates of the object to be detected in the first frame;
calculating an offset vector of the object to be measured between the second frame and the first frame through the pixel similarity, obtaining an estimated pixel coordinate of the object to be measured in the second frame, determining the estimated pixel coordinate as a matched pixel coordinate of the object to be measured in the second frame if the estimated pixel coordinate meets the precision requirement, and calculating a world coordinate of the object to be measured in the second frame by combining the pixel coordinates and the world coordinates of the two cameras;
calculating the estimated pixel coordinate of the object to be measured in the third frame and the corresponding world coordinate of the object to be measured in the third frame by using the matched pixel coordinate of the object to be measured in the second frame and the pixel similarity of the second frame and the third frame;
repeating the steps until a three-dimensional vibration track of the object to be measured in the measuring time period is obtained;
the method for calculating the estimated pixel coordinate of the object to be measured in the second frame comprises the following steps:
A1: Select the width and height of the two camera images as W and H respectively; the maximum width and height of the object to be measured are w and h respectively. The object to be measured in the first frame I0 has pixel coordinate (i0, j0) and pixel distribution A0:
[Equation image omitted: A0, the w×h array of pixel values occupied by the object to be measured in I0]
A2: Search for the object to be measured in the second frame I. Select the width and height of the search box as w and h respectively, set the pixel coordinate of the search box as (i, j), and set the pixel distribution as A:
[Equation image omitted: A, the w×h array of pixel values inside the search box at (i, j)]
A3: Calculate the offset vector:
[Equation image omitted: the offset vector M(xn, yn), a pixel-similarity-weighted mean of the displacements of the neighbourhood pixels relative to the search box, i.e. M(xn, yn) = Σ ω(i, j)·((i, j) − (xn, yn)) / Σ ω(i, j) in the usual mean-shift form]
wherein ω(i, j) is the pixel similarity, and
[Equation image omitted: the definition of ω(i, j), the similarity between the search-box pixel distribution A and the reference distribution A0]
n is the number of times the search box has been shifted while searching the second frame, and (xn, yn) is the pixel coordinate of the search box after the nth shift; (i, j) ranges over the pixel coordinates of the pixel points nearest to (xn, yn) (the count is given in an equation image, omitted), and (in, jn) ranges over the same neighbourhood of (xn, yn).
A4: Let
[Equation image omitted: the definition of m(xn, yn)]
then the estimated pixel coordinate of the object to be measured in the second frame is (x'n, y'n) = m(xn, yn) = (xn, yn) + M(xn, yn).
2. The measurement method according to claim 1, wherein the method of determining whether the estimated pixel coordinate is a matching pixel coordinate of the object to be measured in the second frame comprises:
if M (x)n,yn) Less than or equal to epsilon or N is more than NmaxEstimating pixel coordinates as matched pixel coordinates of the object to be measured in the second frame;
if M (x)n,yn) N is less than or equal to N and | | > epsilonmaxThen return to step a 3;
wherein epsilon is a preset threshold value of the tracking step length of the object to be detected in the second frame, NmaxThe maximum number of offsets when searching in the second frame for the preset search box.
3. The measurement method according to claim 1, characterized in that: before measurement, internal parameter matrixes of the two cameras and external parameter matrixes of the two cameras are obtained respectively.
4. A measuring method according to claim 3, characterized in that: the internal parameter matrix and the external parameter matrix are obtained by the Zhang Zhengyou calibration method.
5. The measurement method of claim 1, wherein the pixel coordinate and world coordinate conversion formula is:
[Equation image omitted; in the standard pinhole form: ZC·[u, v, 1]^T = K·[R | t]·[XW, YW, ZW, 1]^T]
wherein K [equation image omitted] is the internal parameter matrix and [R | t] [equation image omitted] is the external parameter matrix; (u, v) is the coordinate of the object to be measured in the pixel coordinate system; (u0, v0) is the center of the image plane; (x, y) is the coordinate of the object to be measured in the image plane coordinate system; (XW, YW, ZW) is the coordinate of the object to be measured in the world coordinate system.
CN201711188795.3A 2017-11-24 2017-11-24 Binocular vision-based method for measuring three-dimensional vibration track of object to be measured Active CN107945166B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711188795.3A CN107945166B (en) 2017-11-24 2017-11-24 Binocular vision-based method for measuring three-dimensional vibration track of object to be measured

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711188795.3A CN107945166B (en) 2017-11-24 2017-11-24 Binocular vision-based method for measuring three-dimensional vibration track of object to be measured

Publications (2)

Publication Number Publication Date
CN107945166A CN107945166A (en) 2018-04-20
CN107945166B true CN107945166B (en) 2021-09-14

Family

ID=61948657

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711188795.3A Active CN107945166B (en) 2017-11-24 2017-11-24 Binocular vision-based method for measuring three-dimensional vibration track of object to be measured

Country Status (1)

Country Link
CN (1) CN107945166B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110617973B (en) * 2019-04-26 2021-08-27 深圳市豪视智能科技有限公司 Vibration detection method and related device
CN110111390A (en) * 2019-05-15 2019-08-09 湖南科技大学 Thin-wall part omnidirectional vibration measurement method and system based on binocular vision optical flow tracking
CN110245650B (en) * 2019-08-09 2019-12-03 深圳市广宁股份有限公司 Vibrate intelligent detecting method and Related product

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101943566A (en) * 2009-07-07 2011-01-12 Chongqing Technology and Business University Method and device for measuring tiny two-dimensional displacement with a computer camera
CN102142147A (en) * 2010-01-29 2011-08-03 Sony Corporation Device and method for analyzing site content as well as device and method for detecting and tracking target
CN103954221A (en) * 2014-05-08 2014-07-30 Harbin Institute of Technology Binocular photogrammetry method for vibration displacement of large flexible structures
CN107704814A (en) * 2017-09-26 2018-02-16 719th Research Institute of CSIC Video-based vibration target monitoring method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101293040B1 (en) * 2012-05-22 2013-08-05 광주과학기술원 3d vibration measurement method and system using one vibrometer


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on dual area-array CCD vibration testing with a vertical-optical-axis model; Dai Yingjie et al.; Opto-Electronic Engineering (《光电工程》); 2015-03-15 (Issue 03); full text *

Also Published As

Publication number Publication date
CN107945166A (en) 2018-04-20

Similar Documents

Publication Publication Date Title
JP4809291B2 (en) Measuring device and program
CN111354042B (en) Feature extraction method and device of robot visual image, robot and medium
CN108416791B (en) Binocular vision-based parallel mechanism moving platform pose monitoring and tracking method
US20180066934A1 (en) Three-dimensional measurement apparatus, processing method, and non-transitory computer-readable storage medium
JP6734940B2 (en) Three-dimensional measuring device
CN108519102B (en) Binocular vision mileage calculation method based on secondary projection
CN112465877B (en) Kalman filtering visual tracking stabilization method based on motion state estimation
CN107945166B (en) Binocular vision-based method for measuring three-dimensional vibration track of object to be measured
JPWO2005124687A1 (en) Marker tracking method in optical motion capture system, optical motion capture method and system
JP6061770B2 (en) Camera posture estimation apparatus and program thereof
CN110827321B (en) Multi-camera collaborative active target tracking method based on three-dimensional information
US10652521B2 (en) Stereo camera and image pickup system
US20210327130A1 (en) Method and device for determining an area map
JP6922348B2 (en) Information processing equipment, methods, and programs
CN112966571A (en) Standing long jump flight height measurement method based on machine vision
CN110428461B (en) Monocular SLAM method and device combined with deep learning
CN109544584B (en) Method and system for realizing inspection image stabilization precision measurement
CN111105467A (en) Image calibration method and device and electronic equipment
CN107704814B (en) Vibration target monitoring method based on video
WO2021193672A1 (en) Three-dimensional model generation method and three-dimensional model generation device
JP5267100B2 (en) Motion estimation apparatus and program
JP2010009236A (en) Plane area estimation device and program
JP2004028811A (en) Device and method for correcting distance for monitoring system
JP4101478B2 (en) Human body end point detection method and apparatus
US9245343B1 (en) Real-time image geo-registration processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant