CN117722972A - Visual measurement method for large-span discrete point relative deformation of movable base

Publication number: CN117722972A (application CN202311767061.6A; legal status: pending)
Applicant and current assignee: Chinese Flight Test Establishment
Inventors: 胡丙华, 李宏, 李宁宁, 惠广裕, 刘好光, 孙彦狮
Original language: Chinese (zh)

Classifications

  • Y02T 90/00: Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation (climate change mitigation technologies related to transportation)

Landscapes

  • Length Measuring Devices By Optical Means

Abstract

The invention relates to the technical field of dynamic vision measurement, and in particular to a vision measurement method for the large-span discrete point relative deformation of a movable base. Real-time, accurate measurement of the relative pose of large-span discrete points on the movable base is realized through synchronous acquisition and storage by multiple cameras, automatic detection and registration of the marker points, real-time camera pose correction, monocular dynamic pose calculation, and combined-camera parameter transfer.

Description

Visual measurement method for large-span discrete point relative deformation of movable base
Technical Field
The invention belongs to the technical field of dynamic vision measurement, and particularly relates to a vision measurement method for large-span discrete point relative deformation of a movable base.
Background
Vision measurement offers important advantages such as high precision, non-contact operation and low cost, and has a wide range of applications. At present it is mainly applied to continuous motion measurement of a single point, or of multiple points that can be imaged in the same field of view on a measured object, so as to obtain parameters such as position and attitude in an object coordinate system. It is rarely applied to dynamic reference measurement of multiple discrete points that span a large distance and cannot be imaged in a single field of view. In actual engineering, however, there is considerable demand for measuring the three-dimensional position and attitude of such large-span discrete points, for example in ship deformation measurement. As the range (acting distance) and accuracy requirements of modern ship-based weapon systems and measurement guidance systems increase, so does the demand for accurate attitude reference information. Research on a visual measurement method for the large-span discrete point relative deformation of a movable base is therefore of great significance for eliminating or weakening the errors caused by hull deformation.
Hull deformation is the elastic deformation that occurs at different positions of the hull under the action of various stresses in complex sea conditions during actual sailing. The most commonly used real-time measurement methods for hull deformation are inertial measurement, GPS measurement and photogrammetry. Among them, photogrammetry is widely used for its important advantages of non-contact operation, high precision, high real-time performance and low cost. The classical approach performs pose transfer through a chain of imagers arranged as multi-stage transfer stations; it can measure the relative pose, and its variation, between objects in different views or at ultra-large viewing angles, and it has expanded the application prospects of photogrammetry. However, its measurement accuracy is low and needs to be improved. In terms of system layout, multi-stage transferred pose measurement of hidden measurement points is realized by fixing three-dimensional markers and cameras at the same station, with the cameras and an industrial personal computer placed nearby. This layout is unsuitable for hull deformation measurement in aircraft landing tests: in such tests no large protrusions are allowed on the ship deck surface, and the operators must work inside the cabin. It is therefore necessary to improve the existing method to enhance the engineering applicability of hull deformation vision measurement.
Disclosure of Invention
The purpose of the invention is to provide a visual measurement method for the large-span discrete point relative deformation of a movable base, so as to improve the measurement accuracy and engineering applicability of existing measurement methods.
The technical scheme of the invention is as follows:
in order to achieve the above purpose, a visual measurement method for large-span discrete point relative deformation of a movable base is provided, which comprises the following steps:
step 1: determining a measurement reference point and the points to be measured, and densely arranging coded marker points at the measurement reference point and at each point to be measured; arranging camera measurement stations on the movable base, each station comprising at least two cameras, one of which is the reference camera; the reference camera of a station is the camera that photographs the measurement reference point, or the camera that photographs the point to be measured closest to the measurement reference point, and the station containing the camera that photographs the measurement reference point is recorded as the reference camera station; each camera independently photographs all coded marker points at the measurement reference point or at one point to be measured;
preferably, when the cameras of a single camera measurement station cannot photograph and cover the measurement reference point and all points to be measured, multiple camera measurement stations are arranged.
Preferably, once the field of view of each camera is determined, when one camera measurement station photographs two adjacent points and the coded marker points at a photographed position occupy less than 1/3 of the field of view of the corresponding camera, an auxiliary measurement point is added between the two points; the auxiliary measurement point is treated in the same way as a point to be measured in the calculation process.
Preferably, the coded marker points are arranged within a circular area centred on each point with a radius of 0.5-1 m.
Preferably, the coded marker points are pre-stretched according to the shooting distance and shooting angle, so that each marker images as a near-perfect circle, which facilitates its automatic detection and identification.
Step 2: performing internal parameter calibration of all cameras to obtain internal azimuth elements and lens distortion correction parameters of all cameras;
step 3: arranging the external parameter calibration control points of each camera, and obtaining the coordinates of these control points and of all coded marker points in the object coordinate system with a coordinate measuring device;
preferably, the object coordinate system is the spatial reference coordinate system of the platform on which the measurement reference point and the points to be measured are located.
Step 4: collecting external parameter calibration images of each camera, and performing camera external parameter calibration processing to obtain initial external azimuth elements of all cameras;
step 5: calculating the spatial pose conversion matrices of the other cameras of each camera measurement station relative to its reference camera, based on the initial exterior orientation elements of each camera;
the calculation model of the other cameras of a camera measurement station relative to its reference camera is shown in formula (1):

R_{0-i} = R_0^{-1} R_i,  T_{0-i} = R_0^{-1} (T_i - T_0),  i = 1, ..., n    (1)

wherein T_0 and R_0 are the pose matrices formed by the initial exterior orientation elements of the reference camera of a given camera measurement station, T_i and R_i are the pose matrices formed by the initial exterior orientation elements of the other cameras of that station, T_{0-i} and R_{0-i} are the pose conversion matrices of the other cameras of the station relative to the reference camera, and n is the number of other cameras of the station;
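As an illustration of formula (1), the fixed station-internal transform can be computed once after calibration. The following numpy sketch is an illustrative implementation under a standard relative-pose convention, not the patent's verbatim formula; the function name and conventions are ours:

```python
import numpy as np

def relative_pose(R0, T0, Ri, Ti):
    # Pose of camera i expressed relative to the station's reference
    # camera. R0, T0 (3x3 rotation, 3-vector projection centre) belong
    # to the reference camera; Ri, Ti to another camera of the station.
    # Assumed convention: R maps camera coordinates to object
    # coordinates and T is the projection centre in object coordinates.
    R0i = R0.T @ Ri           # relative rotation (R0 is orthonormal)
    T0i = R0.T @ (Ti - T0)    # relative offset in the reference-camera frame
    return R0i, T0i
```

Because the cameras of one station are rigidly connected, (R0i, T0i) stays constant during the test and only needs to be computed once from the static calibration.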
step 6: synchronously acquiring image data to obtain images of the coded marker points at every point at the same moment in the dynamic test;
step 7: detecting, locating, identifying and registering the coded marker points in real time, obtaining the code information and image coordinates of the marker points in each camera's field of view, and establishing a one-to-one correspondence between the image coordinates of the coded marker points detected in real time and their object-coordinate-system coordinates obtained in step 3;
step 8: calculating the dynamic exterior orientation elements of the reference camera of the reference camera station; these are obtained, based on the space resection principle, from the interior orientation elements and lens distortion correction coefficients of that reference camera, the image coordinates of the coded marker points at the measurement reference point, and their corresponding coordinates in the object coordinate system;
step 9: calculating the pose matrices formed by the dynamic exterior orientation elements of the other cameras of the camera measurement station of the previous step; the dynamic pose matrices of the other cameras of the station are obtained from the pose conversion matrices of the other cameras relative to the reference camera calculated in step 5 and the dynamic exterior orientation elements of the reference camera calculated in step 8;
the dynamic pose matrix calculation model of the other cameras of the camera measurement station is as follows:

R'_i = R'_0 R_{0-i},  T'_i = R'_0 T_{0-i} + T'_0,  i = 1, ..., n    (2)

wherein T'_0 and R'_0 are the pose matrices formed by the dynamic exterior orientation elements of the reference camera of the station, T_{0-i} and R_{0-i} are the pose conversion matrices of the other cameras of the station relative to the reference camera, and T'_i and R'_i are the dynamic pose matrices of the other cameras of the station.
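The parameter transfer of step 9 reapplies the stored conversion matrices to the freshly resected reference pose. A minimal numpy sketch, in which the function name and pose convention are our assumptions:

```python
import numpy as np

def transfer_pose(Rp0, Tp0, R0i, T0i):
    # Run-time pose of camera i recovered from the dynamically resected
    # pose (Rp0, Tp0) of the station's reference camera and the fixed
    # relative pose (R0i, T0i) calibrated once in step 5. Assumes the
    # cameras of one station stay rigidly connected during the test.
    Rpi = Rp0 @ R0i
    Tpi = Rp0 @ T0i + Tp0
    return Rpi, Tpi
```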
Step 10: calculating the pose of the point to be measured shot by other cameras of the shooting measuring station in the previous step relative to the measuring reference point; and obtaining the pose of the point to be measured relative to the measurement reference point based on a single-station 6D resolving principle according to the internal azimuth elements, lens distortion correction coefficients, dynamic pose matrix, coded mark point image coordinates of the point to be measured and coordinates of the point to be measured under an object coordinate system of the coded mark point image coordinates of the point to be measured of other cameras of the camera measurement station in the previous step.
The specific calculation process comprises the following steps:
constructing an error equation based on formula (3), and performing least-squares adjustment to obtain the pose data of the point to be measured relative to the measurement reference point;

λ (X_P, Y_P, Z_P)^T = (X_Pt, Y_Pt, Z_Pt)^T + T_{Pi} - T'_i    (3)

wherein (x̄, ȳ) are the image plane coordinates of a coded marker point after lens distortion correction, f is the focal length of the camera lens, λ is a scale coefficient that can be eliminated by transformation of the formula, (X_t, Y_t, Z_t) are the coordinates of the marker point in the object coordinate system, T_{Pi} and R_{Pi} are the translation and rotation matrices formed by the pose parameters of the point to be measured relative to the measurement reference point in the object coordinate system, and (X_P, Y_P, Z_P)^T = R'_i (x̄, ȳ, -f)^T and (X_Pt, Y_Pt, Z_Pt)^T = R_{Pi} (X_t, Y_t, Z_t)^T are the image space auxiliary coordinates and object space auxiliary coordinates, obtained by multiplying the image space coordinates and the object space coordinates by the respective rotation matrix.
Step 11: calculating a new coordinate of the coding mark point at the point to be detected in the last step under the object coordinate system;
the specific calculation process comprises the following steps:
based on the formula (4), obtaining a new coordinate of the coding mark point at the point to be detected under the object coordinate system
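The coordinate update of step 11 is a single rigid-body transform of the marker coordinates; a one-function numpy sketch (the function name is ours):

```python
import numpy as np

def update_marker_coords(Xt, R_pi, T_pi):
    # Apply the solved 6D pose (R_pi, T_pi) of a point to be measured
    # to its marker coordinates Xt (N,3), giving the markers' current
    # object-space coordinates; these then serve as control points for
    # resecting the next station's reference camera (step 12).
    return (R_pi @ Xt.T).T + T_pi
```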
Step 12: calculating a dynamic pose matrix of a reference camera in an adjacent camera measurement station; obtaining a dynamic pose matrix of the camera based on a rear intersection principle according to internal azimuth elements and lens distortion correction coefficients of reference cameras in adjacent camera measurement stations, image coordinates of coding mark points at the point to be measured in the previous step in the camera imaging and new coordinates under an object coordinate system obtained by calculation in the previous step;
step 13: judging whether the points to be measured photographed by the other cameras of the camera measurement station of the previous step include the last point to be measured; if so, repeating steps 9-10 and ending the calculation at the current moment; if not, repeating steps 9-13;
step 14: entering the synchronous acquisition and relative deformation calculation of the image data at the next moment, i.e. repeating steps 6-13 until the test ends.
The invention has the following advantages. The invention provides a visual measurement method for the large-span discrete point relative deformation of a movable base. Real-time, accurate measurement of the relative pose of the large-span discrete points on the movable base is realized through synchronous acquisition and storage by multiple cameras, automatic detection and registration of coded marker points, real-time camera pose correction, monocular dynamic pose calculation, and combined-camera parameter transfer. The invention gives full play to the non-contact, intuitive and high-precision advantages of vision measurement. Building on existing methods, it solves the problem of accurately measuring the relative deformation of large-span discrete points on a movable base by combining relay measurement with combined cameras, local rigid-body 6D measurement and overall deformation measurement. It realizes real-time monitoring of the pose changes of multiple key points of a ship hull without affecting normal operations on the ship's deck surface, provides a reliable basis for real-time correction of system references such as landing guidance and target tracking, and has strong practicability.
Description of the drawings:
in order to more clearly illustrate the technical solutions of the embodiments of the present invention, the following description will briefly explain the drawings required to be used in the embodiments of the present invention, and it is obvious that the drawings described below are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method of a preferred embodiment of the present invention;
FIG. 2 is a schematic diagram of a visual measurement method of large-span discrete point relative deformation of a movable base according to a first embodiment of the present invention;
fig. 3 is a schematic diagram of a visual measurement method of large-span discrete point relative deformation of a moving base according to a second embodiment of the present invention.
The specific embodiments are as follows:
in order to make the objects, technical solutions and advantages of the embodiments of the present invention more clear, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It will be apparent that the described embodiments are some, but not all, embodiments of the invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without making any inventive effort are intended to fall within the scope of the present invention.
Features and exemplary embodiments of various aspects of the invention are described in detail below. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the invention by showing examples of the invention. The present invention is in no way limited to any particular arrangement and method set forth below, but rather covers any adaptations, alternatives, and modifications of structure, method, and device without departing from the spirit of the invention. In the drawings and the following description, well-known structures and techniques have not been shown in detail in order not to unnecessarily obscure the present invention.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other, and the embodiments may be referred to and cited with each other. The invention will be described in detail below with reference to the drawings in connection with embodiments.
The present application provides a visual measurement method for the large-span discrete point relative deformation of a movable base; please refer to FIG. 2 and FIG. 3.
Example 1
As shown in FIG. 2, when the cameras of a single camera measurement station can photograph and cover the measurement reference point and all points to be measured, the visual measurement method specifically comprises the following steps:
step 1: determining the measurement reference point P_0 and the points to be measured P_ci, and densely arranging coded marker points at the measurement reference point and at each point to be measured; the coded marker points are arranged within a circular area centred on each point with a radius of 0.5-1 m, and are pre-stretched according to the shooting distance and shooting angle; the camera measurement station is arranged on the movable base and comprises at least two rigidly connected cameras; each camera independently photographs all coded marker points at the measurement reference point or at one point to be measured; the camera that photographs the measurement reference point is the reference camera;
step 2: performing camera internal parameter calibration to obtain the interior orientation elements and lens distortion correction parameters of all cameras;
step 3: arranging the external parameter calibration control points of each camera, and obtaining the coordinates of these control points and of all coded marker points in the object coordinate system with a coordinate measuring device, wherein the object coordinate system is the spatial reference coordinate system of the platform on which the measurement reference point and the points to be measured are located (designated by the party specifying the measurement requirement);
step 4: collecting the external parameter calibration images of each camera, and performing external parameter calibration processing to obtain the initial exterior orientation elements of all cameras, wherein the initial exterior orientation elements comprise three angle elements (φ, ω, κ) and three line elements (X_S, Y_S, Z_S);
Step 5: calculating the spatial pose conversion matrices of the other cameras of the camera measurement station relative to the reference camera, based on the initial exterior orientation elements of each camera;
the calculation model is shown in formula (1):

R_{0-i} = R_0^{-1} R_i,  T_{0-i} = R_0^{-1} (T_i - T_0),  i = 1, ..., n    (1)

wherein T_0 and R_0 are the pose matrices formed by the initial exterior orientation elements of the reference camera, T_i and R_i are the pose matrices formed by the initial exterior orientation elements of the other cameras, T_{0-i} and R_{0-i} are the pose conversion matrices of the other cameras relative to the reference camera, and n is the number of points to be measured;
step 6: synchronously acquiring image data to obtain images of the coded marker points at the measurement reference point P_0 and at the points to be measured P_ci at the same moment in the dynamic test;
step 7: detecting, locating, identifying and registering the coded marker points in real time, obtaining the code information and image coordinates of the marker points in each camera's field of view, and establishing a one-to-one correspondence between the image coordinates of the coded marker points detected in real time and their object-coordinate-system coordinates obtained in step 3;
step 8: calculating the dynamic exterior orientation elements of the reference camera; these are obtained, based on the space resection principle, from the interior orientation elements and lens distortion correction coefficients of the reference camera, the image coordinates of the coded marker points at P_0, and their corresponding coordinates in the object coordinate system;
step 9: calculating the pose matrices formed by the dynamic exterior orientation elements of the other cameras; the dynamic pose matrices of the other cameras are obtained from the pose conversion matrices of the other cameras relative to the reference camera calculated in step 5 and the dynamic exterior orientation elements of the reference camera calculated in step 8;
the dynamic pose matrix calculation model of the other cameras is as follows:

R'_i = R'_0 R_{0-i},  T'_i = R'_0 T_{0-i} + T'_0,  i = 1, ..., n    (2)

wherein T'_0 and R'_0 are the pose matrices formed by the dynamic exterior orientation elements of the reference camera, T_{0-i} and R_{0-i} are the pose conversion matrices of the other cameras of the station relative to the reference camera, and T'_i and R'_i are the dynamic pose matrices of the other cameras.

Relationship between a pose matrix and the camera exterior orientation elements: T = (X_S, Y_S, Z_S)^T is formed from the three line elements, and R is the 3x3 rotation matrix composed from the three angle elements (φ, ω, κ).
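This relationship can be sketched in numpy; the Y-X-Z composition order of the angle elements is an assumption, since the text does not fix a convention, and the function name is ours:

```python
import numpy as np

def pose_from_elements(phi, omega, kappa, Xs, Ys, Zs):
    # Pose matrix pair (R, T) from the six exterior orientation
    # elements: three angle elements and three line elements.
    # The Y-X-Z composition order is an assumption.
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Ry @ Rx @ Rz, np.array([Xs, Ys, Zs], dtype=float)
```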
step 10: calculating the pose of each point to be measured P_ci photographed by the other cameras relative to the measurement reference point P_0; the pose (φ, ω, κ, ΔX, ΔY, ΔZ)_i of P_ci relative to P_0 is obtained, based on the single-station 6D solving principle, from the interior orientation elements, lens distortion correction coefficients and dynamic pose matrices of the other cameras, together with the image coordinates of the coded marker points at P_ci and their coordinates in the object coordinate system, thereby realizing the relative deformation measurement of all points to be measured.
The specific calculation process comprises the following steps:
1) constructing the equivalence relation:

λ (X_P, Y_P, Z_P)^T = (X_Pt, Y_Pt, Z_Pt)^T + T_{Pi} - T'_i

wherein (x̄, ȳ) are the image plane coordinates of a coded marker point after lens distortion correction, f is the focal length of the camera lens, λ is a scale coefficient that can be eliminated by transformation of the formula, (X_t, Y_t, Z_t) are the coordinates of the coded marker point in the object coordinate system, T_{Pi} and R_{Pi} are the translation and rotation matrices formed by the pose parameters of P_ci relative to P_0 in the object coordinate system, and (X_P, Y_P, Z_P)^T = R'_i (x̄, ȳ, -f)^T and (X_Pt, Y_Pt, Z_Pt)^T = R_{Pi} (X_t, Y_t, Z_t)^T are the image space and object space auxiliary coordinates.
2) solving a single point to be measured: forming the error equation by taking the partial derivatives of the equivalence relation with respect to (φ, ω, κ, ΔX, ΔY, ΔZ); combining the error equations, of the form shown in formula (5), of the coded marker points at each point to be measured into an error equation system; and performing least-squares adjustment to obtain the pose data of each point to be measured relative to P_0.
3) repeating step 2) to obtain the deformation data of all points to be measured relative to P_0.
Example two
As shown in FIG. 3, when the cameras of a single camera measurement station cannot photograph and cover the measurement reference point and all points to be measured, the visual measurement method for the large-span discrete point relative deformation of the movable base specifically comprises the following steps:
step 1: determining the measurement reference point P_0, the points to be measured P_ci and the auxiliary measurement points P_pi, and densely arranging coded marker points at the measurement reference point, the points to be measured and the auxiliary measurement points; the coded marker points are arranged within a circular area centred on each point with a radius of 0.5-1 m, and are pre-stretched according to the shooting distance and shooting angle; the camera measurement stations are arranged on the movable base, each station comprising at least two rigidly connected cameras; each camera independently photographs all coded marker points at one point; the camera of each station that photographs the measurement reference point, or the point to be measured closest to the measurement reference point, is that station's reference camera, and the station that photographs the measurement reference point is the reference camera station.
Step 2: performing camera internal parameter calibration to obtain internal azimuth elements and lens distortion correction parameters of all cameras;
step 3: arranging the external parameter calibration control points of each camera, and obtaining the coordinates of these control points and of all coded marker points in the object coordinate system with a coordinate measuring device, wherein the object coordinate system is the spatial reference coordinate system of the platform on which the measurement reference point and the points to be measured are located (designated by the party specifying the measurement requirement);
step 4: collecting the external parameter calibration images of each camera, and performing external parameter calibration processing to obtain the initial exterior orientation elements of all cameras, wherein the initial exterior orientation elements comprise three angle elements (φ, ω, κ) and three line elements (X_S, Y_S, Z_S);
Step 5: calculating the spatial pose conversion matrices of the other cameras of each camera measurement station relative to its reference camera, based on the initial exterior orientation elements of each camera;
the calculation model is shown in formula (1):

R_{0-i} = R_0^{-1} R_i,  T_{0-i} = R_0^{-1} (T_i - T_0),  i = 1, ..., n    (1)

wherein T_0 and R_0 are the pose matrices formed by the initial exterior orientation elements of the reference camera of a given camera measurement station, T_i and R_i are the pose matrices formed by the initial exterior orientation elements of the other cameras of that station, T_{0-i} and R_{0-i} are the pose conversion matrices of the other cameras of the station relative to the reference camera, and n is the number of other cameras of the station;
step 6: synchronously acquiring image data to obtain images of the coded marker points at the measurement reference point P_0 and at the points to be measured P_ci at the same moment in the dynamic test;
step 7: detecting, locating, identifying and registering the coded marker points in real time, obtaining the code information and image coordinates of the marker points in each camera's field of view, and establishing a one-to-one correspondence between the image coordinates of the coded marker points detected in real time and their object-coordinate-system coordinates obtained in step 3;
step 8: calculating the dynamic exterior orientation elements of the reference camera of the reference camera station; the dynamic exterior orientation elements of the camera photographing P_0 are obtained, based on the space resection principle, from that camera's interior orientation elements and lens distortion correction coefficients, the image coordinates of the coded marker points at P_0, and their corresponding coordinates in the object coordinate system;
step 9: calculating the pose matrices formed by the dynamic exterior orientation elements of the other cameras of the camera measurement station of the previous step; the dynamic pose matrices of the other cameras of the station are obtained from the pose conversion matrices of the other cameras relative to the reference camera calculated in step 5 and the dynamic exterior orientation elements of the reference camera calculated in step 8;
the dynamic pose matrix calculation model of the other cameras of the camera measurement station is as follows:

R'_i = R'_0 R_{0-i},  T'_i = R'_0 T_{0-i} + T'_0,  i = 1, ..., n    (2)

wherein T'_0 and R'_0 are the pose matrices formed by the dynamic exterior orientation elements of the reference camera of the station, T_{0-i} and R_{0-i} are the pose conversion matrices of the other cameras of the station relative to the reference camera, and T'_i and R'_i are the dynamic pose matrices of the other cameras of the station.

Relationship between a pose matrix and the camera exterior orientation elements: T = (X_S, Y_S, Z_S)^T is formed from the three line elements, and R is the 3x3 rotation matrix composed from the three angle elements (φ, ω, κ).
step 10: calculating the pose, relative to the measurement reference point P_0, of the points to be measured P_ci photographed by the other cameras of the camera measurement station of the previous step; the pose (φ, ω, κ, ΔX, ΔY, ΔZ)_i of P_ci relative to P_0 is obtained, based on the single-station 6D solving principle, from the interior orientation elements, lens distortion correction coefficients and dynamic pose matrices of those cameras, together with the image coordinates of the coded marker points at P_ci and their coordinates in the object coordinate system.
The specific calculation process comprises the following steps:
1) Construct the collinearity relation:

λ · [x̄, ȳ, −f]ᵀ = R′_iᵀ · (R_Pi · [X_t, Y_t, Z_t]ᵀ + T_Pi − T′_i)  (3)

wherein (x̄, ȳ) are the image plane coordinates of the coded mark point after lens distortion error correction, f is the focal length of the camera lens, λ is a scale coefficient that can be eliminated by rearranging the formula, (X_t, Y_t, Z_t) are the coordinates of the coded mark point in the object coordinate system, and T_Pi, R_Pi are the translation vector and rotation matrix formed by the pose parameters of P_ci relative to P_0 in the object coordinate system.
2) Solve for a single point to be measured.

Form the error equation: taking partial derivatives of the relation above with respect to each of (φ, ω, κ, ΔX, ΔY, ΔZ) yields the linearized error equation (5).
Combine the error equations, of the form of formula (5), for all coded mark points at each point to be measured into an error equation system; least squares adjustment then yields the pose data of each point to be measured relative to P_0.
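The adjustment step reduces to solving the normal equations of the linearized error system; a generic sketch (the patent's specific Jacobian for (φ, ω, κ, ΔX, ΔY, ΔZ) is not reproduced here):

```python
import numpy as np

def lsq_correction(A, l):
    """Least squares solution of A @ dx = l via the normal equations
    (A^T A) @ dx = A^T l, as in one iteration of the adjustment."""
    return np.linalg.solve(A.T @ A, A.T @ l)
```

Each coded mark point contributes two rows (one per image coordinate) to A and l; iterating the correction until it becomes negligible yields the 6D pose.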
3) Repeat step 2) to obtain the pose data of all points to be measured of camera measurement station A relative to P_0.
step 11: calculating the new coordinates, in the object coordinate system, of the coded mark points at the point to be measured P_ci from the previous step;
the specific calculation process comprises the following steps:
Based on formula (7), the new coordinates of the coded mark points at the point to be measured P_ci in the object coordinate system are obtained:

[X′_t, Y′_t, Z′_t]ᵀ = R_Pi · [X_t, Y_t, Z_t]ᵀ + T_Pi  (7)
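Formula (7) is a rigid-body update of the markers' calibrated coordinates; a minimal sketch (names assumed):

```python
import numpy as np

def update_marker_coords(R_p, T_p, pts):
    """Apply the estimated relative pose to an (N, 3) array of marker
    coordinates, giving their current object-space positions."""
    return pts @ R_p.T + T_p
```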
Step 12: calculating a dynamic pose matrix of a reference camera in an adjacent camera measurement station; obtaining a dynamic pose matrix of a reference camera of the photographing measuring station based on a rear intersection principle according to internal azimuth elements and lens distortion correction coefficients of the reference cameras in the adjacent photographing measuring stations, image coordinates of the coding mark points at the point to be measured in the previous step in the imaging of the camera and new coordinates under an object coordinate system obtained by calculation in the previous step;
step 13: judging whether the point to be measured photographed by the other cameras of the camera measurement station in the previous step is the last point to be measured; if so, repeating steps 9-10 and ending the calculation at the current moment; if not, repeating steps 9-13 until the pose calculation of all points to be measured relative to P_0 is completed, i.e. the large-span discrete point relative deformation measurement at the current moment is realized.
step 14: entering the synchronous acquisition of image data and the measurement and calculation of relative deformation at the next moment, until the test is finished.
The above embodiments are only for illustrating the technical concept and features of the present invention, and are intended to enable those skilled in the art to understand the content of the present invention and implement it accordingly, and are not intended to limit the scope of the present invention, but all equivalent changes or modifications made according to the spirit of the present invention should be included in the scope of the present invention. The technology, shape, and construction parts of the present invention, which are not described in detail, are known in the art.

Claims (8)

1. The visual measurement method for the large-span discrete point relative deformation of the movable base is characterized by comprising the following steps of:
step 1: determining a measurement reference point and a point to be measured, and densely distributing coding mark points at the measurement reference point and the point to be measured respectively; the camera measuring stations are arranged on the movable base, each camera measuring station at least comprises two cameras, one of the cameras is a reference camera, the reference camera is a camera for shooting a measurement reference point position in each camera measuring station, or a camera for shooting a point to be measured relatively close to the measurement reference point position, and the camera measuring station comprising the camera for shooting the measurement reference point position is recorded as a reference camera measuring station; each camera is used for shooting all coding mark points at the measurement reference point position or the point to be measured independently;
step 2: performing camera internal parameter calibration to obtain internal azimuth elements and lens distortion correction parameters of all cameras;
step 3: arranging each camera external parameter calibration control point, and obtaining coordinates of the camera external parameter calibration control points and all coding mark points under an object coordinate system through a coordinate measuring device;
step 4: collecting external parameter calibration images of each camera, and performing camera external parameter calibration processing to obtain initial external azimuth elements of all cameras;
step 5: calculating a space pose conversion matrix of other cameras of each shooting measurement station relative to the reference camera based on the initial external azimuth element of each camera;
step 6: synchronously acquiring image data to obtain images of the measurement reference point positions and the coding mark points at the positions to be measured at the same moment in a dynamic test;
step 7: the coding mark points are detected, positioned, identified and registered in real time, mark point coding information and image coordinates thereof in the view field of the corresponding camera are obtained, and a one-to-one correspondence relation between the image coordinates of the coding mark points detected in real time and the coordinates of the object coordinate system obtained in the step 3 is established;
step 8: calculating dynamic external azimuth elements of a reference camera of the reference camera measuring station;
step 9: calculating a pose matrix formed by other camera dynamic external azimuth elements of the camera measuring station in the previous step;
step 10: calculating the pose of the point to be measured shot by other cameras of the shooting measuring station in the previous step relative to the measuring reference point;
step 11: calculating a new coordinate of the coding mark point at the point to be detected in the last step under the object coordinate system;
step 12: calculating a dynamic pose matrix of a reference camera in an adjacent camera measurement station;
step 13: judging whether the point to be measured photographed by the other cameras in the camera measurement station in the previous step is the last point to be measured; if so, repeating steps 9-10 and ending the calculation at the current moment; if not, repeating steps 9-13;
step 14: entering the synchronous acquisition of image data and the relative deformation measurement calculation at the next moment, i.e. repeating steps 6-13, until the test is finished.
2. The visual measurement method of large-span discrete point relative deformation of a moving base according to claim 1, wherein in the step 1, when the camera in a single camera measurement station cannot shoot and cover the measurement reference point and all the points to be measured, a plurality of camera measurement stations are arranged.
3. The visual measurement method for large-span discrete point relative deformation of a movable base according to claim 2, wherein, with the photographing field of view of each camera determined, when two adjacent point positions are photographed by one camera measurement station and the imaging area of the coded mark points at those two point positions in the corresponding camera covers less than 1/3 of the field of view, an auxiliary measurement point is added between the two points; the auxiliary measurement point is treated as a point to be measured in the calculation process.
4. The visual measurement method for the large-span discrete point relative deformation of the movable base according to claim 1, wherein the coding mark points are arranged in a circular area with each point as a center and a radius of 0.5-1 m.
5. The visual measurement method for the large-span discrete point relative deformation of the movable base according to claim 4, wherein the coded mark points are stretched and deformed in advance according to the changes of photographing distance and photographing angle.
6. The visual measurement method of large-span discrete point relative deformation of a moving base according to claim 1, wherein the step 5 specifically comprises the following steps:
the calculation model of the other cameras of the camera measurement station relative to the reference camera is shown as (1):

R_{0-i} = R_0ᵀ · R_i,  T_{0-i} = R_0ᵀ · (T_i − T_0),  i = 1, 2, …, n  (1)

wherein T_0 and R_0 form the pose matrix composed of the initial external azimuth elements of the reference camera of a certain camera measurement station, T_i and R_i form the pose matrices composed of the initial external azimuth elements of the other cameras of the station, T_{0-i} and R_{0-i} are the pose conversion matrices of the other cameras of the station relative to the reference camera, and n is the number of other cameras of the station.
7. The visual measurement method of large-span discrete point relative deformation of a moving base according to claim 6, wherein said step 9 comprises the steps of:
according to the pose conversion matrices of the other cameras of the camera measurement station relative to the reference camera calculated in step 5, and the dynamic external azimuth elements of the reference camera calculated in step 8, the dynamic pose matrices of the other cameras of the station are obtained;

the dynamic pose matrix calculation model of the other cameras of the camera measurement station is:

R′_i = R′_0 · R_{0-i},  T′_i = R′_0 · T_{0-i} + T′_0,  i = 1, 2, …, n  (2)

wherein T′_0, R′_0 form the pose matrix composed of the dynamic external azimuth elements of the reference camera of camera measurement station A; T_{0-i} and R_{0-i} are the pose conversion matrices of the other cameras of the station relative to the reference camera; T′_i, R′_i are the dynamic pose matrices of the other cameras of the station.
8. The visual measurement method of large-span discrete point relative deformation of a moving base according to claim 7, wherein said step 10 comprises the steps of:
constructing an error equation based on formula (3) and performing least squares adjustment calculation to obtain the pose data of the point to be measured relative to the measurement reference point;

λ · [x̄, ȳ, −f]ᵀ = R′_iᵀ · (R_Pi · [X_t, Y_t, Z_t]ᵀ + T_Pi − T′_i)  (3)

wherein (x̄, ȳ) are the image plane coordinates of the coded mark point after lens distortion error correction, f is the focal length of the camera lens, λ is a scale coefficient that can be eliminated by rearranging the formula, (X_t, Y_t, Z_t) are the coordinates of the mark point in the object coordinate system, T_Pi, R_Pi are the translation vector and rotation matrix formed by the pose parameters of the point to be measured relative to the measurement reference point in the object coordinate system, and (X_P, Y_P, Z_P) and (X_Pt, Y_Pt, Z_Pt) are respectively the image-space auxiliary coordinates and object-space auxiliary coordinates obtained by multiplying the image-space and object-space coordinates by the rotation matrix.
CN202311767061.6A 2023-12-20 2023-12-20 Visual measurement method for large-span discrete point relative deformation of movable base Pending CN117722972A (en)

Publications (1)

Publication Number: CN117722972A; Publication Date: 2024-03-19



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination