CN116608937A - Large flexible structure vibration mode identification method and test device based on computer vision - Google Patents

Large flexible structure vibration mode identification method and test device based on computer vision

Info

Publication number
CN116608937A
CN116608937A (application CN202310566207.4A)
Authority
CN
China
Prior art keywords
image
coordinate system
camera
target
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310566207.4A
Other languages
Chinese (zh)
Inventor
夏红伟 (Xia Hongwei)
解源 (Xie Yuan)
李莉 (Li Li)
马广程 (Ma Guangcheng)
安昊 (An Hao)
考永贵 (Kao Yonggui)
马长波 (Ma Changbo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN202310566207.4A
Publication of CN116608937A
Legal status: Pending

Links

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01H: MEASUREMENT OF MECHANICAL VIBRATIONS OR ULTRASONIC, SONIC OR INFRASONIC WAVES
    • G01H 9/00: Measuring mechanical vibrations or ultrasonic, sonic or infrasonic waves by using radiation-sensitive means, e.g. optical means
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002: Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B 11/02: Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B 11/03: Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring coordinates of points
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M 1/00: Testing static or dynamic balance of machines or structures
    • G01M 1/12: Static balancing; Determining position of centre of gravity
    • G01M 1/122: Determining position of centre of gravity
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/26: Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V 10/28: Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06V 10/30: Noise filtering
    • G06V 10/34: Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
    • G06V 10/36: Applying a local operator, i.e. means to operate on image points situated in the vicinity of a given point; Non-linear local filtering operations, e.g. median filtering
    • G06V 10/40: Extraction of image or video features
    • G06V 10/56: Extraction of image or video features relating to colour
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 90/00: Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Nonlinear Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a computer-vision-based vibration mode identification method and test device for large flexible structures, belonging to the technical field of simulation tests. The method comprises the following steps: camera modeling and calibration; image acquisition; image preprocessing; cooperative target contour extraction; cooperative target centroid calculation; cooperative target star coordinate solution; cooperative target star coordinate transformation; Kalman filtering; and spectrum analysis. Compared with traditional vibration measurement methods and devices, the method and device are non-contact, highly accurate, low in identification error and easy to deploy; they not only allow multi-point measurement over a global range but also achieve high-precision measurement of the vibration displacement of the flexible appendage while reducing the complexity of on-orbit equipment.

Description

Large flexible structure vibration mode identification method and test device based on computer vision
Technical Field
The invention relates to a large flexible structure vibration mode identification method and a test device based on computer vision, and belongs to the technical field of simulation tests.
Background
With the development of aerospace technology, space satellite structures have gradually become more complex and increasingly carry light, large flexible appendages. A large flexible appendage structure is light, lightly damped and low in vibration frequency, and is easily disturbed by torque-output components such as flywheels and thrusters, producing long-duration, low-frequency vibration that degrades the attitude pointing accuracy of the satellite and seriously shortens the service life of the device. How to suppress the flexible vibration of large flexible appendages is therefore important, and the measurement and identification of the flexible modal parameters is the key to flexible vibration suppression technology.
Vibration measurement methods for large flexible appendages can be divided, by measurement means, into contact and non-contact techniques. Contact techniques obtain the vibration displacement information of the large flexible appendage from sensors mounted on the appendage, but the sensors affect the mass distribution of a light flexible appendage and change its dynamic characteristics, so the measurement error is large and the approach is unsuitable for vibration measurement of light, large flexible appendages. Non-contact techniques measure the vibration displacement information of the appendage indirectly by optical means, allowing multi-point measurement over a global range and high-precision measurement of flexible-appendage vibration displacement while reducing the complexity of on-orbit equipment. Much research on non-contact measurement has been carried out at home and abroad: Avitabile et al. measured the vibration of a flexible structure with a machine-vision method and demonstrated the feasibility of vision-based measurement by comparison with accelerometer and laser-vibrometer results; Qiu et al. used a vision sensor to measure the bending and torsional modal parameters of a flexible piezoelectric cantilever plate in real time, providing a reliable feedback measurement means for stable control of the cantilever plate; and NASA measured the vibration deformation of the solar arrays of the Hubble telescope in a complex environment with a camera measurement method.
Owing to the constraints of the on-orbit spacecraft's space environment and the particularity of flexible appendages, the vibration parameters of a flexible appendage are usually obtained in feedback with a non-contact measurement method based on a monocular camera, so researching and building a ground demonstration and simulation system that reproduces the spacecraft's real on-orbit environment is of great significance.
Disclosure of Invention
The invention aims to solve the above problems in the prior art and provides a computer-vision-based vibration mode identification method and test device for large flexible structures.
The aim of the invention is achieved by the following technical scheme:
a large flexible structure vibration mode identification method based on computer vision comprises the following steps:
step one: camera modeling and calibration
Analyzing the imaging process of the camera, obtaining the internal and external parameter matrices of the camera through camera calibration, and establishing a monocular camera pinhole imaging model, camera lens distortion being neglected;
step two: acquiring images
Acquiring an image containing a cooperative target on the flexible truss at a high speed in real time through an industrial camera, transmitting the image to a measurement PC through an image acquisition card, and carrying out subsequent image processing and coordinate resolving work by using the measurement PC;
step three: image preprocessing
The image preprocessing operation comprises setting a target area, image filtering, color recognition binarization and morphological processing; this preprocessing flow filters out most of the interfering noise in the image and extracts the foreground image information containing the characteristic targets;
step four: collaborative target contour extraction
The binarized image subjected to image preprocessing only contains foreground information of the measurement cooperative targets, and contour information of each cooperative target needs to be extracted and separated so as to solve the centroid position of the target subsequently;
step five: collaborative target centroid calculation
For the extracted cooperative target contour point information, performing characteristic target centroid positioning by using a sub-pixel positioning method based on edge curve fitting;
step six: cooperative target star coordinate solution
Through the image preprocessing algorithm and the cooperative target centroid extraction algorithm, the centre positions of the several cooperative targets in the image are obtained; however, because the camera in the system rotates with the single-axis air floating platform, the imaging coordinate system is not fixed and modal vibration cannot be identified directly from the target positions in the image, so the centroid image coordinates must first be converted and unified into the static coordinate system, after which spectrum analysis of the vibration displacement curves identifies the vibration modal parameters;
step seven: coordinate transformation of cooperative target star
Considering the case in which the single-axis air floating platform rotates, a static coordinate system and a star coordinate system are established, the static coordinate system being defined as the star coordinate system when the single-axis air floating platform is at rest and the flexible truss is unstressed. When the single-axis air floating platform rotates through an angle θ, the star coordinate system and the static coordinate system satisfy a rotation transformation relation; to identify the modal vibration frequencies, the star coordinates at all times must be unified into the static coordinate system.
Step eight: kalman filter processing
The vibration displacement curves in the static coordinate system contain interfering noise signals, and the interference errors come mainly from the sensor noise of the camera and the inertial measurement unit; the method therefore adopts Kalman filtering to eliminate the sensor noise and obtain an optimal estimation curve of the measured values. Kalman filtering is a linear filtering and prediction method that uses a linear system state equation to optimally estimate the system state from the system's inputs and output observations.
Step nine: spectral analysis
A Discrete Fourier Transform (DFT) is used to perform spectral analysis on the displacement curves. The discrete Fourier transform is an effective method for spectral analysis of a finite-length discrete time-domain sequence: transforming the finite set of sampled points yields the discrete sequence information in the frequency domain and hence the frequency composition of the sampled signal, in which the largest principal-component frequencies are the frequencies of the successive vibration-mode orders, realizing the identification of the vibration mode parameters of the large flexible satellite.
A test device for the computer-vision-based large flexible structure vibration mode identification method comprises: a measuring industrial personal computer, an industrial camera, a flexible appendage, measurement cooperative targets, a single-axis air floating platform and an air floating base;
the measuring industrial personal computer and the industrial camera are arranged on the single-axis air floating platform and connected to each other; the measurement cooperative targets are connected through the flexible appendage, one side of which is arranged on the single-axis air floating platform and the other side on the air floating base, the air floating base supporting the flexible appendage and the measurement cooperative targets.
The beneficial effects of the invention are as follows:
Compared with traditional vibration measurement methods and devices, the method and device for identifying the vibration mode of a large flexible structure are non-contact, highly accurate, low in identification error and easy to deploy. Vibration measurement methods for large flexible appendages can be divided, by measurement means, into contact and non-contact techniques: contact techniques obtain the vibration displacement information of the large flexible appendage from sensors mounted on the appendage, but affect the mass distribution of a light flexible appendage and change its dynamic characteristics, giving large measurement errors and making them unsuitable for vibration measurement of light, large flexible appendages; non-contact techniques measure the vibration displacement information of the appendage indirectly by optical means, allowing multi-point measurement over a global range and high-precision measurement of flexible-appendage vibration displacement while reducing the complexity of on-orbit equipment.
Drawings
FIG. 1 is a flow chart of the computer-vision-based large flexible structure vibration mode identification method.
FIG. 2 shows the star coordinate system of the method and test device.
FIG. 3 shows the conversion relation between the star coordinate system and the static coordinate system.
FIG. 4 is a schematic structural diagram of the test device.
In the figures, reference numeral 1 is the measuring industrial personal computer, 2 is the industrial camera, 3 is the flexible appendage, 4 is the measurement cooperative target, 5 is the single-axis air floating platform, and 6 is the air floating base.
Detailed Description
The invention will be described in further detail with reference to the accompanying drawings. The embodiment is implemented on the premise of the technical solution of the invention and gives a detailed implementation, but the scope of protection of the invention is not limited to the following embodiments.
As shown in fig. 4, the test device for the computer-vision-based large flexible structure vibration mode identification method according to this embodiment comprises: a measuring industrial personal computer 1, an industrial camera 2, a flexible appendage 3, measurement cooperative targets 4, a single-axis air floating platform 5 and an air floating base 6;
the measuring industrial personal computer 1 and the industrial camera 2 are arranged on the single-axis air floating platform 5 and connected to each other; the measurement cooperative targets 4 are connected through the flexible appendage 3, one side of which is arranged on the single-axis air floating platform 5 and the other side on the air floating base 6, the air floating base 6 supporting the flexible appendage 3 and the measurement cooperative targets 4.
Example 1
Step one: camera modeling and calibration
Analyzing the imaging process of the camera, the internal and external parameter matrices of the camera are obtained through camera calibration, and a monocular camera pinhole imaging model is established, camera lens distortion being neglected.

According to the similar-triangle relations in the pinhole imaging model, the correspondence between the image plane coordinate system and the camera coordinate system is obtained:

$$Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}$$

where $(x, y, 1)^T$ are the homogeneous coordinates of the object point P in the image plane, $(X_c, Y_c, Z_c, 1)^T$ are the homogeneous coordinates of the space object point P in the camera coordinate system, and f is the focal length of the camera lens.

Since the camera coordinate system and the world coordinate system are related by a rigid transformation, the following correspondence holds:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

where $(X_w, Y_w, Z_w, 1)^T$ are the homogeneous coordinates of the space object point P in the world coordinate system, and R and T are the rotation matrix and the translation vector from the world coordinate system to the camera coordinate system, respectively.

Since the pixel coordinate system and the image plane coordinate system satisfy an equal-ratio scaling relation with a fixed offset of the zero point, the following correspondence holds:

$$u = \frac{x}{dx} + u_0, \qquad v = \frac{y}{dy} + v_0$$

where $(u, v, 1)^T$ are the homogeneous coordinates of the object point P in the pixel coordinate system, dx and dy are the scaling coefficients in the transverse and longitudinal directions, and $u_0$ and $v_0$ are the offsets of the zero point in the transverse and longitudinal directions.

Combining the above relations gives the following camera pinhole imaging model:

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{f}{dx} & 0 & u_0 & 0 \\ 0 & \frac{f}{dy} & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

In the above formula, let

$$M_1 = \begin{bmatrix} \frac{f}{dx} & 0 & u_0 & 0 \\ 0 & \frac{f}{dy} & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}, \qquad M_2 = \begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix}$$

Then:

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = M_1 M_2 \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

where $M_1$ and $M_2$ are the internal and external parameter matrices of the camera, respectively; both are obtained with the Zhang calibration method and serve as prior conditions for constructing the monocular camera pinhole imaging model.
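As an illustration of the combined model above, the projection can be sketched in a few lines of Python/NumPy (an illustrative sketch only; the library choice and the numeric intrinsic and extrinsic values below are assumptions, not values from the patent):

```python
import numpy as np

def project_point(P_w, M1, M2):
    """Project a homogeneous world point (4,) to pixel coordinates (u, v).

    M1: 3x4 internal parameter matrix [[f/dx, 0, u0, 0], [0, f/dy, v0, 0], [0, 0, 1, 0]]
    M2: 4x4 external parameter matrix [[R, T], [0^T, 1]]
    """
    p = M1 @ (M2 @ P_w)        # equals Z_c * (u, v, 1)^T
    return p[:2] / p[2]        # divide out the depth Z_c

# Assumed example values: 8 mm lens, 5 um pixels, principal point (640, 512).
f, dx, dy, u0, v0 = 8e-3, 5e-6, 5e-6, 640.0, 512.0
M1 = np.array([[f / dx, 0.0, u0, 0.0],
               [0.0, f / dy, v0, 0.0],
               [0.0, 0.0, 1.0, 0.0]])
M2 = np.eye(4)                 # camera frame coincides with world frame here
print(project_point(np.array([0.1, 0.05, 2.0, 1.0]), M1, M2))
```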
Step two: acquiring images
Acquiring an image containing a cooperative target on the flexible truss at a high speed in real time through an industrial camera, transmitting the image to a measurement PC through an image acquisition card, and carrying out subsequent image processing and coordinate resolving work by using the measurement PC;
step three: image preprocessing
The image preprocessing operation comprises setting a target area, image filtering, color recognition binarization and morphological processing; this preprocessing flow filters out most of the interfering noise in the image and extracts the foreground image information containing the characteristic targets;
setting a target area: since the flexible accessory and the cooperative target only occupy a small part of the image, and most of the image is redundant environmental information, it is unnecessary to process all pixels of the whole image one by one, and therefore, a proper fixed target area is set for the acquired image first, so that all the cooperative targets on the flexible accessory are stable in the area of the image along with vibration of the flexible accessory.
Image filtering: the purpose of image filtering is to filter noise information out of the image, reduce the interference of ambient light sources and make pixel variations smoother. The method uses Gaussian filtering, whose principle is to convolve each pixel with a Gaussian template: the pixel values in the surrounding n×n neighbourhood are weighted with Gaussian weights, and the result is the new value of that pixel. The mathematical expression of the Gaussian filter kernel is:

$$G(x, y) = \frac{1}{2\pi\sigma^2}\, e^{-\frac{x^2 + y^2}{2\sigma^2}}$$

where σ is the standard deviation of the Gaussian distribution.
color recognition binarization: the purpose of the color recognition binarization is to extract foreground image information containing characteristic targets and distinguish foreground information from background information in the image. The principle is that firstly, according to HSV color space, the HSV space value range of characteristic target color is obtained, namely HSV color space corresponding to red color, and a range set is defined
Wherein H, s, v correspond to the hue, saturation and brightness of the pixel point respectively, and H min and Hmax Respectively representing the lower limit and the upper limit of the tone value, S min and Smax Respectively representing the lower limit and the upper limit of the saturation value, V min and Vmax Respectively representing the lower limit and the upper limit of the brightness value;
combining HSV three-channel values for each pixel in an image with the above rangeComparing, and judging the pixel of the point as a foreground point or a background point according to the comparison result, wherein the specific formula is as follows:
wherein ,for the above defined set of ranges, I Binary The gray scale of the binarized image is valued, 0 or 1,0 is white, 1 is black, I hsv And (5) taking the value of hsv three channels at the current pixel point.
Morphological treatment: the morphological treatment is mainly corrosion and expansion operation, and the isolated noise points in the obtained binary image can be eliminated through the morphological treatment, so that the extraction characteristics are more obvious, and the mark points can be conveniently extracted later; the principle of morphological processing is to use a specific kernel to convolve with the image to obtain a new value for each pixel.
The morphological erosion acts to expand the black portion of the image, the specific calculation formula of which is shown below:
the morphological dilation acts to dilate the white portion of the image, the specific calculation formula of which is shown below:
where (x, y) is the pixel point on the image, (x ', y') is the pixel point on the kernel, dst (x, y) is the calculated value at the pixel point (x, y), src (x, y) is the original value at the pixel point (x, y).
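The preprocessing chain of this step (fixed target area, Gaussian filtering, HSV color binarization, erosion and dilation) might be sketched with OpenCV as follows; the ROI bounds and the red HSV ranges are assumed example values rather than values given in the patent:

```python
import cv2
import numpy as np

def preprocess(frame):
    """Return a binary mask of the red cooperative targets in a BGR frame."""
    roi = frame[200:800, 300:1600]              # fixed target area (assumed bounds)
    blurred = cv2.GaussianBlur(roi, (5, 5), 1.5)
    hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)
    # Red wraps around hue 0 on OpenCV's 0-179 hue scale, hence two ranges.
    mask = cv2.inRange(hsv, (0, 100, 100), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 100, 100), (179, 255, 255))
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.erode(mask, kernel)              # remove isolated noise points
    mask = cv2.dilate(mask, kernel)             # restore the eroded target area
    return mask
```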
Step four: collaborative target contour extraction
The binarized image subjected to image preprocessing only contains foreground information of the measured cooperative targets, and contour information of each cooperative target needs to be extracted and separated so as to calculate the centroid position of the target subsequently.
Edge information of each cooperative target is extracted with the Canny edge detection algorithm, which mainly comprises the following parts: Gaussian filtering, gradient calculation, non-maximum suppression and edge linking.

First, the image is filtered with a two-dimensional Gaussian kernel, as in the preprocessing step above, to remove noise interference.

Next, the amplitude and direction of the gray-value gradient of the image are calculated by convolving the image with horizontal and vertical difference operators, giving for each pixel the gradient components $G_x$ and $G_y$ and hence

$$M = \sqrt{G_x^2 + G_y^2}, \qquad \theta = \arctan\!\left(\frac{G_y}{G_x}\right)$$

The larger the amplitude M, the more violent the gray-level change at that point and the more likely the point is an edge point of the image.

For all possible edge points in the image, local edge points are then obtained through non-maximum suppression: each point's gradient magnitude is compared with its neighbours along the gradient direction, only local maxima are kept, non-maximum points are marked as 0, and the rule is applied cyclically to every pixel in the image.

Finally, the detected edge points are linked into closed contours: an edge image is first obtained with a high threshold; since the edges may not yet be closed at this stage, the end points of the edge image are further searched with a low threshold until the whole edge contour closes, yielding the closed contour information in the image.
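A minimal sketch of this contour extraction step, using OpenCV's Canny and contour-tracing functions (the hysteresis thresholds and the area filter are assumed values, not from the patent):

```python
import cv2

def extract_contours(mask):
    """Extract closed target outlines from the binary mask."""
    edges = cv2.Canny(mask, 50, 150)            # low/high hysteresis thresholds
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    # Keep only outlines large enough to be a cooperative target.
    return [c for c in contours if cv2.contourArea(c) > 20.0]
```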
Step five: collaborative target centroid calculation
For the extracted cooperative target contour points, the centroid of the characteristic target is located with a sub-pixel positioning method based on edge curve fitting. Since the characteristic target is circular, the preset fitting curve L and the deviation function Q are, respectively,

$$L:\ x^2 + y^2 + ax + by + c = 0$$

$$Q(a, b, c) = \sum_{i=1}^{N} \left(x_i^2 + y_i^2 + a x_i + b y_i + c\right)^2$$

where a, b and c are the undetermined parameters of the quadratic fitting curve L, $(x_i, y_i)$ are the contour points, and the fitted solution is the parameter set at which the deviation function Q attains its minimum.

The fitting-curve parameters are solved by optimization according to the least-squares principle: the partial derivatives of Q(a, b, c) with respect to a, b and c are set equal to 0, forming the equation set

$$\frac{\partial Q}{\partial a} = 0, \qquad \frac{\partial Q}{\partial b} = 0, \qquad \frac{\partial Q}{\partial c} = 0$$

Solving this equation set gives the minimum point and the optimal estimates of the fitting-curve parameters, from which the sub-pixel centre coordinates follow as $(x_0, y_0) = \left(-\frac{a}{2}, -\frac{b}{2}\right)$.
Step six: cooperative target star coordinate solution
Through the image preprocessing algorithm and the cooperative target centroid extraction algorithm, the centre positions of the several cooperative targets in the image are obtained; however, because the camera in the system rotates with the single-axis air floating platform, the imaging coordinate system is not fixed and modal vibration cannot be identified directly from the target positions in the image, so the centroid image coordinates must first be converted and unified into the static coordinate system, after which spectrum analysis of the vibration displacement curves identifies the vibration modal parameters.
The rigid connection end of the single-axis air floating platform and the flexible truss is taken as the coordinate origin, and the tangential and normal directions of the connection end as the x axis and the y axis respectively, establishing the star coordinate system, whose definition is shown schematically in FIG. 2; the star coordinate system rotates with the single-axis air floating platform. Then, according to the monocular camera imaging model, the star coordinates of the extracted cooperative target centroids are solved as follows:
the monocular camera pinhole imaging model established by the above is established by the following formula:
P p =M 1 P n
wherein ,Pp 、P n Normalized coordinates of the feature target centroid under the image coordinate system and under the camera coordinate system, M 1 The internal parameter matrix of the camera is obtained by camera calibration, and is a reversible matrix, and the inverse matrix is
wherein ,ax ,a y Is the scaling factor between the image coordinate system and the camera coordinate system, u 0 and v0 Offset vectors of coordinate origins between the image coordinate system and the camera coordinate system respectively;
then the characteristic target centroid can be obtained by back-pushing in a camera coordinate systemThe normalized coordinates P below n
According to the definition of the normalized coordinates, the following holds:
wherein ,Pc Camera coordinate system coordinates, Z, of the feature target centroid c For depth estimation information, the above formula requires the unknown Z c To calculate the coordinates in the camera coordinate system, the specific method is as follows:
according to the monocular camera pinhole imaging model, the following holds:
P c =M 2 P w
wherein ,M2 The external parameter matrix of the camera obtained by calibrating the camera is a reversible matrix, and the inverse matrix isP w Coordinates in a star coordinate system where the cooperative targets are located:
then the left and right sides are multiplied togetherThe method can obtain the following steps:
taking the third row, and expanding to obtain the following formula:
Z w =(r 31 X n +r 32 Y n +r 33 )Z c -(r 31 t 1 +r 32 t 2 +r 33 t 3 )
due to the artificially-defined star coordinate system in Z w =0 is truss movement plane, and Z is obtained after substitution c
Obtaining the unknown quantity Z c Then, the method can be carried into the middle to obtain the center of mass coordinate P of the cooperative target under the camera coordinate system c Then the camera coordinates are carried into the world coordinate system to obtain the characteristic target centroid coordinates P w And converting the two-dimensional image coordinates of the feature targets into star coordinates to obtain the coordinates of each cooperative target in the star coordinate system.
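A NumPy sketch of this back-projection follows, taking $M_1$ as the 3×3 internal parameter matrix and $M_2$ as the 4×4 external parameter matrix and using the $Z_w = 0$ plane constraint derived above to supply the depth (illustrative code, not from the patent):

```python
import numpy as np

def pixel_to_star(u, v, M1, M2):
    """Lift a pixel centroid (u, v) to star coordinates P_w on the Z_w = 0 plane.

    M1: 3x3 internal parameter matrix; M2: 4x4 external parameter matrix.
    """
    P_n = np.linalg.inv(M1) @ np.array([u, v, 1.0])   # normalized camera ray
    M2_inv = np.linalg.inv(M2)
    r, t = M2_inv[:3, :3], M2_inv[:3, 3]
    # Third row of P_w = M2_inv @ (Z_c * P_n, 1), with Z_w = 0, gives Z_c:
    Z_c = -t[2] / (r[2] @ P_n)
    P_c = np.append(Z_c * P_n, 1.0)                   # camera-frame point (homogeneous)
    return (M2_inv @ P_c)[:3]                         # star coordinates P_w
```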
Step seven: coordinate transformation of cooperative target star
Considering the case in which the single-axis air floating platform rotates, the relation between the static coordinate system O-XYZ and the star coordinate system O-X'Y'Z' is shown in FIG. 3, the static coordinate system being defined as the star coordinate system when the single-axis air floating platform is at rest and the flexible truss is unstressed. When the single-axis air floating platform rotates through an angle θ, the star coordinate system and the static coordinate system satisfy a rotation transformation relation; to identify the modal vibration frequencies, the star coordinates at all times must be unified into the static coordinate system.
Since the single-axis air floating platform performs only one-dimensional rotational motion, the rigid-body rotation theorem gives the relation between the coordinate P' in the O-X'Y'Z' system and the coordinate P in the O-XYZ system:

$$P' = R_Z P$$

where $R_Z$ is the rotation matrix between the coordinate systems:

$$R_Z = \begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

For the star coordinates P' of each cooperative target solved in each image frame, the rotation angle information θ of the single-axis air floating platform at the current moment is acquired in real time through the inertial components on the platform, and the coordinates are converted and unified into the static coordinate system by calculation, facilitating the subsequent modal vibration frequency identification work.
Step eight: kalman filter processing
The vibration displacement curves in the static coordinate system contain interfering noise signals, and the interference errors come mainly from the sensor noise of the camera and the inertial measurement unit; the method therefore adopts Kalman filtering to eliminate the sensor noise and obtain an optimal estimation curve of the measured values. Kalman filtering is a linear filtering and prediction method that uses a linear system state equation to optimally estimate the system state from the system's inputs and output observations.

The state equation and the measurement equation of the system are:

$$x_k = F x_{k-1} + B u_{k-1} + w_{k-1}$$

$$z_k = H x_k + v_k$$

where $x_k$ is the state vector at time k; $z_k$ is the measurement vector at time k; F is the one-step state transition matrix; $u_k$ is the input to the system; B is the matrix converting the input vector into a state vector; H is the measurement model matrix; and $w_k \sim N(0, Q_k)$ and $v_k \sim N(0, R_k)$ are uncorrelated process and measurement noise obeying Gaussian distributions.

The recursive equations of the Kalman filter are:

$$\hat{x}_{k|k-1} = F \hat{x}_{k-1} + B u_{k-1}$$

$$P_{k|k-1} = F P_{k-1} F^T + Q_k$$

$$K_k = P_{k|k-1} H_k^T \left(H_k P_{k|k-1} H_k^T + R_k\right)^{-1}$$

$$\hat{x}_k = \hat{x}_{k|k-1} + K_k \left(z_k - H_k \hat{x}_{k|k-1}\right)$$

$$P_k = (I - K_k H_k) P_{k|k-1}$$

where $\hat{x}_{k-1}$ is the filtered estimate of the state $x_{k-1}$ and $\hat{x}_{k|k-1}$ is the one-step prediction of $x_k$ computed from it; in this system, the one-step prediction of the target position and velocity is computed through the established system model from the state variable $x_{k-1}$ at the previous moment and the real-time vibration displacement information $z_k$ measured by the camera. $K_k$ is the filter gain matrix, $P_{k|k-1}$ is the covariance of the one-step prediction obtained from the last optimal estimate, $P_k$ and $P_{k-1}$ are the estimate covariances at times k and k-1, $H_k$ is the measurement model matrix, $R_k$ is the measurement noise variance, and $Q_k$ is the process noise matrix.
before the algorithm is run, proper filtering parameters including initial position and speed, noise characteristics of environment and the like are required to be given, and then target acceleration information acquired by inertial component elements on a single-axis air floatation platform and obtained after data processing is used as input u of each step k The vibration displacement from the camera measurements is taken as input z for each step k Substituting the information into an iterative equation to obtain the optimal estimation result of the vibration displacement of each step.
Step nine: spectral analysis
A Discrete Fourier Transform (DFT) is used to perform spectral analysis on the displacement curves. The discrete Fourier transform is an effective method for spectral analysis of a finite-length discrete time-domain sequence: transforming the finite set of sampled points yields the discrete sequence information in the frequency domain and hence the frequency composition of the sampled signal, in which the largest principal-component frequencies are the frequencies of the successive vibration-mode orders, realizing the identification of the vibration mode parameters of the large flexible satellite.

First, a fixed number N of samples of the obtained discrete vibration displacement information is taken to form a discrete finite time sequence x(t), which is converted into the discrete sequence x(n) by replacing the original time variable with the sequence number n; truncating the sequence keeps only the finite part

$$x(n), \quad n = 0, 1, 2, \ldots, N-1$$

The discrete Fourier transform of this sequence is

$$X(k) = \sum_{n=0}^{N-1} x(n)\, e^{-j \frac{2\pi}{N} k n}, \quad k = 0, 1, \ldots, N-1$$

where k is the multiple of one frequency interval; finally, X(k) is mapped to X(f) through

$$f = \frac{k}{N} f_s$$

where $f_s$ is the sampling frequency. This yields the frequency distribution of the discrete signal in the frequency domain f, realizing the identification of the vibration modal parameters of the flexible satellite appendage.
The foregoing is merely a preferred embodiment of the invention, based on different implementations within the overall concept of the invention, and the scope of protection of the invention is not limited thereto; any change or substitution that would readily occur to a person skilled in the art within the technical scope disclosed by the invention shall fall within the scope of protection of the invention. Therefore, the protection scope of the invention shall be subject to the protection scope of the claims.

Claims (2)

1. The method for identifying the vibration mode of the large flexible structure based on computer vision is characterized by comprising the following steps of:
step one: camera modeling and calibration
Analyzing the imaging process of the camera, the internal and external parameter matrices of the camera are obtained through camera calibration, and a monocular camera pinhole imaging model is established, camera lens distortion being neglected;

according to the similar-triangle relations in the pinhole imaging model, the correspondence between the image plane coordinate system and the camera coordinate system is obtained:

$$Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}$$

where $(x, y, 1)^T$ are the homogeneous coordinates of the object point P in the image plane, $(X_c, Y_c, Z_c, 1)^T$ are the homogeneous coordinates of the space object point P in the camera coordinate system, and f is the focal length of the camera lens;

since the camera coordinate system and the world coordinate system are related by a rigid transformation, the following correspondence holds:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

where $(X_w, Y_w, Z_w, 1)^T$ are the homogeneous coordinates of the space object point P in the world coordinate system, and R and T are the rotation matrix and the translation vector from the world coordinate system to the camera coordinate system, respectively;

since the pixel coordinate system and the image plane coordinate system satisfy an equal-ratio scaling relation with a fixed offset of the zero point, the following correspondence holds:

$$u = \frac{x}{dx} + u_0, \qquad v = \frac{y}{dy} + v_0$$

where $(u, v, 1)^T$ are the homogeneous coordinates of the object point P in the pixel coordinate system, dx and dy are the scaling coefficients in the transverse and longitudinal directions, and $u_0$ and $v_0$ are the offsets of the zero point in the transverse and longitudinal directions;

combining the above relations gives the following camera pinhole imaging model:

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{f}{dx} & 0 & u_0 & 0 \\ 0 & \frac{f}{dy} & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

in the above formula, let

$$M_1 = \begin{bmatrix} \frac{f}{dx} & 0 & u_0 & 0 \\ 0 & \frac{f}{dy} & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}, \qquad M_2 = \begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix}$$

then:

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = M_1 M_2 \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

where $M_1$ and $M_2$ are the internal and external parameter matrices of the camera, respectively; both are obtained with the Zhang calibration method and serve as prior conditions for constructing the monocular camera pinhole imaging model;
step two: acquiring images
Acquiring an image containing a cooperative target on the flexible truss at a high speed in real time through an industrial camera, transmitting the image to a measurement PC through an image acquisition card, and carrying out subsequent image processing and coordinate resolving work by using the measurement PC;
step three: image preprocessing
The image preprocessing operation comprises setting a target area, image filtering, color recognition binarization and morphological processing; this preprocessing flow filters out most of the interfering noise in the image and extracts the foreground image information containing the characteristic targets;

setting a target area: since the flexible appendage and the cooperative targets occupy only a small part of the image and most of the image is redundant environmental information, it is unnecessary to process every pixel of the whole picture one by one; a suitable fixed target area is therefore first set in the acquired picture, chosen so that all the cooperative targets on the flexible appendage remain inside this image area as the appendage vibrates;
and (3) image filtering: the purpose of image filtering is to filter noise information in an image, reduce interference of an ambient light source in the image and enable pixel change to be smoother; the method selects Gaussian filtering, and the principle is that each pixel point is convolved by using a Gaussian template, namely, the pixel values in the surrounding n fields are taken to carry out Gaussian weights, and the result is a new pixel value of the pixel point; the mathematical formula expression of the Gaussian filter is shown as follows:
color recognition binarization: the purpose of the color recognition binarization is to extract foreground image information containing characteristic targets and distinguish foreground information from background information in the images; the principle is that firstly, according to HSV color space, the HSV space value range of characteristic target color is obtained, namely HSV color space corresponding to red color, and a range set is defined
Wherein H, s, v correspond to the hue, saturation and brightness of the pixel point respectively, and H min and Hmax Respectively representing the lower limit and the upper limit of the tone value, S min and Smax Respectively representing the lower limit and the upper limit of the saturation value, V min and Vmax Respectively representing the lower limit and the upper limit of the brightness value;
combining HSV three-channel values for each pixel in an image with the above rangeComparing, and judging the point according to the comparison resultThe pixels are foreground points or background points, and the specific formula is as follows:
wherein ,for the above defined set of ranges, I Binary The gray scale of the binarized image is valued, 0 or 1,0 is white, 1 is black, I hsv The hsv three-channel value at the current pixel point is taken;
morphological treatment: the morphological treatment is mainly corrosion and expansion operation, and the isolated noise points in the obtained binary image can be eliminated through the morphological treatment, so that the extraction characteristics are more obvious, and the mark points can be conveniently extracted later; the principle of morphological processing is to use a specific kernel to carry out convolution operation with an image to obtain a new value of each pixel point;
the morphological erosion acts to expand the black portion of the image, the specific calculation formula of which is shown below:
the morphological dilation acts to dilate the white portion of the image, the specific calculation formula of which is shown below:
wherein (x, y) is a pixel point on the image, (x ', y') is a pixel point on the kernel, dst (x, y) is a calculated value at the pixel point (x, y), src (x, y) is an original value at the pixel point (x, y);
step four: collaborative target contour extraction
The binarized image produced by preprocessing contains only the foreground information of the measurement cooperative targets, and the contour information of each cooperative target needs to be extracted and separated so that the centroid position of each target can be solved subsequently;

edge information of each cooperative target is extracted with the Canny edge detection algorithm, which mainly comprises the following parts: Gaussian filtering, gradient calculation, non-maximum suppression and edge linking;

first, the image is filtered with a two-dimensional Gaussian kernel, as in the preprocessing step above, to remove noise interference;

next, the amplitude and direction of the gray-value gradient of the image are calculated by convolving the image with horizontal and vertical difference operators, giving for each pixel the gradient components $G_x$ and $G_y$ and hence

$$M = \sqrt{G_x^2 + G_y^2}, \qquad \theta = \arctan\!\left(\frac{G_y}{G_x}\right)$$

the larger the amplitude M, the more violent the gray-level change at that point and the more likely the point is an edge point of the image;

for all possible edge points in the image, local edge points are then obtained through non-maximum suppression: each point's gradient magnitude is compared with its neighbours along the gradient direction, only local maxima are kept, non-maximum points are marked as 0, and the rule is applied cyclically to every pixel in the image;

finally, the detected edge points are linked into closed contours: an edge image is first obtained with a high threshold; since the edges may not yet be closed at this stage, the end points of the edge image are further searched with a low threshold until the whole edge contour closes, yielding the closed contour information in the image;
step five: collaborative target centroid calculation
For the extracted cooperative target contour points, the centroid of the characteristic target is located with a sub-pixel positioning method based on edge curve fitting; since the characteristic target is circular, the preset fitting curve L and the deviation function Q are, respectively,

$$L:\ x^2 + y^2 + ax + by + c = 0$$

$$Q(a, b, c) = \sum_{i=1}^{N} \left(x_i^2 + y_i^2 + a x_i + b y_i + c\right)^2$$

where a, b and c are the undetermined parameters of the quadratic fitting curve L, $(x_i, y_i)$ are the contour points, and the fitted solution is the parameter set at which the deviation function Q attains its minimum;

the fitting-curve parameters are solved by optimization according to the least-squares principle: the partial derivatives of Q(a, b, c) with respect to a, b and c are set equal to 0, forming the equation set

$$\frac{\partial Q}{\partial a} = 0, \qquad \frac{\partial Q}{\partial b} = 0, \qquad \frac{\partial Q}{\partial c} = 0$$

solving this equation set gives the minimum point and the optimal estimates of the fitting-curve parameters, from which the sub-pixel centre coordinates follow as $(x_0, y_0) = \left(-\frac{a}{2}, -\frac{b}{2}\right)$;
Step six: cooperative target star coordinate solution
Through the image preprocessing algorithm and the cooperative target centroid extraction algorithm, the centre positions of the several cooperative targets in the image are obtained; however, because the camera in the system rotates with the single-axis air floating platform, the imaging coordinate system is not fixed and modal vibration cannot be identified directly from the target positions in the image, so the centroid image coordinates must first be converted and unified into the static coordinate system, after which spectrum analysis of the vibration displacement curves identifies the vibration modal parameters;
the rigid connection end of the single-axis air floating platform and the flexible truss is taken as the coordinate origin, and the tangential and normal directions of the connection end as the x axis and the y axis respectively, establishing the star coordinate system, which rotates with the single-axis air floating platform; since the industrial camera is fixedly connected to the single-axis air floating platform through a bracket, the star coordinate system and the camera satisfy a rigid transformation relation, and the star coordinate system is used as the world coordinate system in camera calibration; then, according to the monocular camera imaging model, the star coordinates of the extracted cooperative target centroids are solved as follows:

from the monocular camera pinhole imaging model established above, the following formula holds:

$$P_p = M_1 P_n$$

where $P_p$ and $P_n$ are the coordinates of the characteristic target centroid in the image coordinate system and its normalized coordinates in the camera coordinate system, respectively, and $M_1$ is the camera internal parameter matrix obtained by camera calibration; $M_1$ is invertible, with inverse

$$M_1^{-1} = \begin{bmatrix} \frac{1}{a_x} & 0 & -\frac{u_0}{a_x} \\ 0 & \frac{1}{a_y} & -\frac{v_0}{a_y} \\ 0 & 0 & 1 \end{bmatrix}$$

where $a_x$ and $a_y$ are the scaling factors between the image coordinate system and the camera coordinate system, and $u_0$ and $v_0$ are the offsets of the coordinate origin between the two systems;

the normalized coordinates of the characteristic target centroid in the camera coordinate system are then recovered as $P_n = M_1^{-1} P_p$; according to the definition of normalized coordinates, the following holds:

$$P_c = Z_c P_n$$

where $P_c$ is the coordinate of the characteristic target centroid in the camera coordinate system and $Z_c$ is the depth information; the unknown $Z_c$ is required before the camera-frame coordinates can be computed, and is obtained as follows:

according to the monocular camera pinhole imaging model, the following holds:

$$P_c = M_2 P_w$$

where $M_2$ is the camera external parameter matrix obtained by camera calibration, an invertible matrix with inverse $M_2^{-1}$, and $P_w$ is the coordinate in the star coordinate system in which the cooperative target lies; multiplying both sides on the left by $M_2^{-1}$ gives

$$P_w = M_2^{-1} P_c = Z_c\, M_2^{-1} P_n$$

taking the third row and expanding yields

$$Z_w = (r_{31} X_n + r_{32} Y_n + r_{33}) Z_c - (r_{31} t_1 + r_{32} t_2 + r_{33} t_3)$$

where $r_{ij}$ are the elements of the rotation part of $M_2^{-1}$, $t_i$ the elements of the translation vector, and $(X_n, Y_n, 1)^T = P_n$; since the star coordinate system is defined so that $Z_w = 0$ is the truss motion plane, substitution gives

$$Z_c = \frac{r_{31} t_1 + r_{32} t_2 + r_{33} t_3}{r_{31} X_n + r_{32} Y_n + r_{33}}$$

with the unknown $Z_c$ obtained, substitution gives the centroid coordinate $P_c$ of the cooperative target in the camera coordinate system, and carrying the camera coordinates into the world-coordinate relation gives the characteristic target centroid coordinate $P_w$, completing the conversion of the two-dimensional image coordinates of the characteristic targets into star coordinates and yielding the coordinates of each cooperative target in the star coordinate system;
step seven: coordinate transformation of cooperative target star
Establishing a static coordinate system O-XYZ and a star coordinate system O-X'Y'Z' considering the rotation of the single-axis air floating platform, the static coordinate system being defined as the star coordinate system when the single-axis air floating platform is at rest and the flexible truss is unstressed; when the single-axis air floating platform rotates through an angle θ, the star coordinate system and the static coordinate system satisfy a rotation transformation relation, and to identify the modal vibration frequencies the star coordinates at all times must be unified into the static coordinate system;

since the single-axis air floating platform performs only one-dimensional rotational motion, the rigid-body rotation theorem gives the relation between the coordinate P' in the O-X'Y'Z' system and the coordinate P in the O-XYZ system:

$$P' = R_Z P$$

where $R_Z$ is the rotation matrix between the coordinate systems:

$$R_Z = \begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

for the star coordinates P' of each cooperative target solved in each image frame, the rotation angle information θ of the single-axis air floating platform at the current moment is acquired in real time through the inertial components on the platform, and the coordinates are converted and unified into the static coordinate system by calculation, facilitating the subsequent modal vibration frequency identification work;
step eight: kalman filter processing
The vibration displacement curves in the static coordinate system contain interfering noise signals, and the interference errors come mainly from the sensor noise of the camera and the inertial components; the method therefore adopts Kalman filtering to eliminate the sensor noise and obtain an optimal estimation curve of the measured values; Kalman filtering is a linear filtering and prediction method that uses a linear system state equation to optimally estimate the system state from the system's inputs and output observations;

the state equation and the measurement equation of the system are:

$$x_k = F x_{k-1} + B u_{k-1} + w_{k-1}$$

$$z_k = H x_k + v_k$$

where $x_k$ is the state vector at time k; $z_k$ is the measurement vector at time k; F is the one-step state transition matrix; $u_k$ is the input to the system; B is the matrix converting the input vector into a state vector; H is the measurement model matrix; and $w_k \sim N(0, Q_k)$ and $v_k \sim N(0, R_k)$ are uncorrelated process and measurement noise obeying Gaussian distributions;

the recursive equations of the Kalman filter are:

$$\hat{x}_{k|k-1} = F \hat{x}_{k-1} + B u_{k-1}$$

$$P_{k|k-1} = F P_{k-1} F^T + Q_k$$

$$K_k = P_{k|k-1} H_k^T \left(H_k P_{k|k-1} H_k^T + R_k\right)^{-1}$$

$$\hat{x}_k = \hat{x}_{k|k-1} + K_k \left(z_k - H_k \hat{x}_{k|k-1}\right)$$

$$P_k = (I - K_k H_k) P_{k|k-1}$$

where $\hat{x}_{k-1}$ is the filtered estimate of the state $x_{k-1}$ and $\hat{x}_{k|k-1}$ is the one-step prediction of $x_k$ computed from it; in this system, the one-step prediction of the target position and velocity is computed through the established system model from the state variable $x_{k-1}$ at the previous moment and the real-time vibration displacement information $z_k$ measured by the camera; $K_k$ is the filter gain matrix, $P_{k|k-1}$ is the covariance of the one-step prediction obtained from the last optimal estimate, $P_k$ and $P_{k-1}$ are the estimate covariances at times k and k-1, $H_k$ is the measurement model matrix, $R_k$ is the measurement noise variance, and $Q_k$ is the process noise matrix;
before the algorithm is run, proper filtering parameters including initial position and speed, noise characteristics of environment and the like are required to be given, and then target acceleration information acquired by inertial component elements on a single-axis air floatation platform and obtained after data processing is used as input u of each step k The vibration displacement from the camera measurements is taken as input z for each step k Substituting the information into an iterative equation to obtain an optimal estimation result of the vibration displacement of each step;
step nine: spectral analysis
Performing spectral analysis on the displacement curves with a Discrete Fourier Transform (DFT); the discrete Fourier transform is an effective method for spectral analysis of a finite-length discrete time-domain sequence: transforming the finite set of sampled points yields the discrete sequence information in the frequency domain and hence the frequency composition of the sampled signal, in which the largest principal-component frequencies are the frequencies of the successive vibration-mode orders, realizing the identification of the vibration mode parameters of the large flexible satellite;

first, a fixed number N of samples of the obtained discrete vibration displacement information is taken to form a discrete finite time sequence x(t), which is converted into the discrete sequence x(n) by replacing the original time variable with the sequence number n; truncating the sequence keeps only the finite part

$$x(n), \quad n = 0, 1, 2, \ldots, N-1$$

the discrete Fourier transform of this sequence is

$$X(k) = \sum_{n=0}^{N-1} x(n)\, e^{-j \frac{2\pi}{N} k n}, \quad k = 0, 1, \ldots, N-1$$

where k is the multiple of one frequency interval; finally, X(k) is mapped to X(f) through

$$f = \frac{k}{N} f_s$$

where $f_s$ is the sampling frequency, yielding the frequency distribution of the discrete signal in the frequency domain f and realizing the identification of the vibration modal parameters of the flexible satellite appendage.
2. A test device for the computer-vision-based large flexible structure vibration mode identification method according to claim 1, characterized by comprising: a measuring industrial personal computer (1), an industrial camera (2), a flexible appendage (3), measurement cooperative targets (4), a single-axis air floating platform (5) and an air floating base (6);

wherein the measuring industrial personal computer (1) and the industrial camera (2) are arranged on the single-axis air floating platform (5) and connected to each other; the measurement cooperative targets (4) are connected through the flexible appendage (3), one side being arranged on the single-axis air floating platform (5) and the other side on the air floating base (6), the air floating base (6) supporting the flexible appendage (3) and the measurement cooperative targets (4).
CN202310566207.4A 2023-05-18 2023-05-18 Large flexible structure vibration mode identification method and test device based on computer vision Pending CN116608937A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310566207.4A CN116608937A (en) 2023-05-18 2023-05-18 Large flexible structure vibration mode identification method and test device based on computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310566207.4A CN116608937A (en) 2023-05-18 2023-05-18 Large flexible structure vibration mode identification method and test device based on computer vision

Publications (1)

Publication Number Publication Date
CN116608937A true CN116608937A (en) 2023-08-18

Family

ID=87681247

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310566207.4A Pending CN116608937A (en) 2023-05-18 2023-05-18 Large flexible structure vibration mode identification method and test device based on computer vision

Country Status (1)

Country Link
CN (1) CN116608937A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117893610A (en) * 2024-03-14 2024-04-16 四川大学 Aviation assembly robot gesture measurement system based on zoom monocular vision
CN117893610B (en) * 2024-03-14 2024-05-28 四川大学 Aviation assembly robot gesture measurement system based on zoom monocular vision

Similar Documents

Publication Publication Date Title
CN107833249B (en) Method for estimating attitude of shipboard aircraft in landing process based on visual guidance
CN111563878B (en) Space target positioning method
CN110942458A (en) Temperature anomaly defect detection and positioning method and system
JP2021168143A (en) System and method for efficiently scoring probe in image by vision system
Zhou et al. Fast star centroid extraction algorithm with sub-pixel accuracy based on FPGA
CN116608937A (en) Large flexible structure vibration mode identification method and test device based on computer vision
CN110018170B (en) Honeycomb model-based aircraft skin small damage positioning method
CN110363758B (en) Optical remote sensing satellite imaging quality determination method and system
CN114993452B (en) Structure micro-vibration measurement method and system based on broadband phase motion amplification
CN110852213A (en) Template matching-based pointer instrument multi-condition automatic reading method
Cabo et al. A hybrid SURF-DIC algorithm to estimate local displacements in structures using low-cost conventional cameras
CN113658147B (en) Workpiece size measuring device and method based on deep learning
CN113340405B (en) Bridge vibration mode measuring method, device and system
Wang et al. Infrared Earth sensor with a large field of view for low-Earth-orbiting micro-satellites
CN111735447B (en) Star-sensitive-simulated indoor relative pose measurement system and working method thereof
CN111815580B (en) Image edge recognition method and small module gear module detection method
CN112819935A (en) Method for realizing three-dimensional reconstruction of workpiece based on binocular stereo vision
CN117405279A (en) Unmanned plane platform and reference-free correction-based cable force measurement method and system
Zhou et al. Robust and high-precision vision system for deflection measurement of crane girder with camera shake reduction
CN116363121A (en) Computer vision-based inhaul cable force detection method, system and device
US11748442B2 (en) Image matching device
Ning et al. Spacecraft angular velocity estimation method using optical flow of stars
Wang et al. Facilitating PTZ camera auto-calibration to be noise resilient with two images
CN114062265A (en) Method for evaluating stability of supporting structure of visual system
Dikmen Development of star tracker attitude and position determination system for spacecraft maneuvering and docking facility

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination