CN113610763A - Rocket engine structural member pose motion compensation method in vibration environment - Google Patents
- Publication number
- CN113610763A (application number CN202110776175.1A)
- Authority
- CN
- China
- Prior art keywords
- camera
- rocket engine
- pose
- motion
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Geometry (AREA)
- Quality & Reliability (AREA)
- Multimedia (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention relates to a rocket engine structural member pose motion compensation method in a vibration environment, which is characterized by comprising the following steps of: s1: pasting a mark point; s2: calibrating a camera; s3: collecting an image; s4: extracting and matching feature points; s5: and (5) image resolving. The compensation method of the invention is that when the rocket engine test pose vision measurement is carried out, the motion compensation is carried out on the shot image, the relative displacement relation between different frames is found out, and the offset is compensated. The method improves the pose resolving precision of the rocket engine structural member in the impact vibration environment of the pose vision measuring system.
Description
Technical Field
The invention relates to a rocket engine structural member pose motion compensation method in a vibration environment, and is applicable to the technical field of precision pose measurement during rocket engine test runs.
Background
As the power plant of rockets and missile weapons, the rocket engine directly determines their performance. Ignition and test-run testing of rocket engines therefore plays a significant role in rocket development: it is an important component of rocket technology and the main, indispensable test mode at every stage of engine pre-research, model development, prototyping, and batch production.
The ignition test of a rocket engine often produces impact and vibration, which strongly affect the engine's body structural components. To evaluate the soundness of the structural design, the pose of the structural parts must be measured in real time during the test. For dynamic pose measurement, a photogrammetric method with several cameras measuring online can generally be adopted: feature identifiers made of a special retroreflective material are preset on the rocket engine structural members, images of the marked sections are acquired, and the relative three-dimensional position and attitude between different sections are obtained through key steps such as image precision processing, feature matching, binocular vision calculation, and three-dimensional position calculation.
However, during the test run the camera is affected by impact and vibration, which causes relative motion between the camera and the photographed object; the content of adjacent frames is therefore shifted, and this shift degrades the measurement accuracy.
Disclosure of Invention
Technical problem to be solved
Aiming at the defects and requirements of the prior art, the invention provides a rocket engine structural member pose motion compensation method in a vibration environment. During visual pose measurement of a rocket engine test run, motion compensation is performed on the captured images: the relative displacement between different frames is found and the offset is compensated. The method improves the pose-resolving precision of the rocket engine structural member when the pose vision measurement system is subjected to impact and vibration.
(II) technical scheme
A rocket engine structural member pose motion compensation method in a vibration environment comprises the following steps:
s1: pasting a mark point;
s2: calibrating a camera;
s3: collecting an image;
s4: extracting and matching feature points;
s5: and (5) image resolving.
Step S1 specifically includes: before the rocket engine test run, coded mark points are pasted on each structural part of the rocket engine, and a plurality of coded mark points are pasted at relatively static positions of the test background so as to obtain the motion pose of the camera.
Step S2 specifically includes: the camera is calibrated with a calibration plate, and 9 image pairs of the calibration plate are shot from different positions and orientations to obtain the camera's intrinsic and extrinsic parameter matrices.
Step S3 specifically includes: the rocket engine test run is started, and the cameras shoot synchronously to obtain image data.
Step S4 specifically includes: in order to solve the camera motion parameters, it is necessary to obtain pairs of feature points between different frames.
Step S5 specifically includes: the pose of each structural part of the rocket engine is resolved, and the feature points at the relatively static positions of the test background are also resolved; the feature-point solution matrix of the static background positions is then used to compensate the pose solution of the corresponding structural member in the image of the same frame.
Step S2 further includes a step of establishing a camera motion model, specifically:
considering the relative motion between the test background and the camera as motion of the imaging plane while the test background remains static, the projection of any point Q in three-dimensional space onto the image plane can be expressed in homogeneous coordinates as the following formulas (1) and (2):

s·[x, y, 1]ᵀ = M·[R | t]·[X, Y, Z, 1]ᵀ    (1)

M = [[fx, 0, Cx], [0, fy, Cy], [0, 0, 1]]    (2)

In formulas (1) and (2), X, Y, Z are the homogeneous coordinate values of the point Q, and x, y are its coordinate values after projection; s is a scale factor; R and t are the rotation matrix and translation vector from the world coordinate system to the camera coordinate system, i.e. the camera's extrinsic parameters; M is the camera's intrinsic parameter matrix; fx and fy are the focal lengths in the x and y directions; Cx and Cy are the x and y coordinates of the image center, which is the intersection of the optical axis with the image plane.
The step of establishing the camera motion model further comprises the following steps:

The world coordinate system is chosen so that the test background plane lies at Z = 0, and the rotation matrix R is decomposed into its 3×1 column vectors, R = (r1, r2, r3), where r1, r2, r3 are the first, second, and third column vectors of R. Because the third column vector is no longer needed when Z = 0, formula (1) can be expressed as the following formula (3):

s·[x, y, 1]ᵀ = M·[r1, r2, t]·[X, Y, 1]ᵀ    (3)

The homography matrix from any point Q in three-dimensional space to the imaging plane is therefore H = sM[r1, r2, t], and the image background distortion caused by relative motion between the camera and the photographed object can be described by an 8-parameter perspective model.
Step S4 specifically includes:
In order to solve the camera motion parameters, feature point pairs between different frames need to be obtained. A circular target point, which is rotation-invariant and strongly noise-resistant, is used as the feature point. Projected through the lens onto the imaging plane, the circular target becomes an elliptical feature point; an ellipse-fitting method is used to fit the edge of the elliptical feature point in the image, and its center coordinates are then solved. In a two-dimensional rectangular coordinate system, the ellipse equation is:
f(p, m) = ax² + bxy + cy² + dx + ey + f = 0    (4)

where p and m are the ellipse equation parameter vectors, p = [a, b, c, d, e, f], m = [x², xy, y², x, y, 1], and a, b, c, d, e, f are the coefficients of the terms of the binary quadratic ellipse equation.
Further, the constraint ‖p‖ = 1 is introduced, p is normalized to unit length, and a cost function F(p) is constructed:

F(p) = Σᵢ₌₁ᴺ f(p, mᵢ)² + M·(‖p‖ − 1)²    (5)

where M is a penalty factor and N is the number of points participating in the ellipse fitting;
therefore, the solution of the ellipse equation becomes the minimization of the cost function F(p). Newton's method is used to solve for p, and the center coordinates (u0, v0) of the elliptical feature point are then solved from the fitted parameters p through the following formula (6):

u0 = (be − 2cd) / (4ac − b²),  v0 = (bd − 2ae) / (4ac − b²)    (6)

where (u0, v0) are the center coordinates of the elliptical feature point.
(III) advantageous effects
The invention discloses a rocket engine structural member pose motion compensation method in a vibration environment: during visual pose measurement of a rocket engine test run, motion compensation is performed on the captured images, the relative displacement between different frames is found, and the offset is compensated. The method improves the pose-resolving precision of the rocket engine structural member when the pose vision measurement system is subjected to impact and vibration.
Detailed Description
The invention discloses a rocket engine structural member pose motion compensation method in a vibration environment, which comprises the following steps:
s1: pasting the mark points:
before the rocket engine test run, coded mark points are pasted on each structural part of the rocket engine, and a plurality of coded mark points are pasted at relatively static positions of the test background to acquire the motion pose of the camera;
s2: calibrating a camera:
the camera is calibrated with a calibration plate, and 9 images of the calibration plate are shot from different positions and orientations to obtain the camera's intrinsic and extrinsic parameter matrices;
s3: image acquisition:
the rocket engine test run is started, and the cameras shoot synchronously to obtain image data;
s4: extracting and matching feature points:
in order to solve the motion parameters of the camera, feature point pairs between different frames need to be obtained;
s5: image resolving:
the pose of each structural part of the rocket engine is resolved, and the feature points at the relatively static positions of the test background are also resolved; the feature-point solution matrix of the static background positions is then used to compensate the pose solution of the corresponding structural member in the image of the same frame.
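The compensation in step S5 can be illustrated with a minimal sketch. It assumes the camera jitter between a reference frame and the current frame has already been estimated from the static background markers as a 3×3 homography H; the function name and numeric values below are illustrative, not taken from the patent:

```python
import numpy as np

def compensate_points(pts, H):
    """Map measured pixel coordinates back into the reference frame by
    applying the inverse of the background homography H (3x3 array)."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])   # to homogeneous coords
    mapped = pts_h @ np.linalg.inv(H).T                # undo the camera motion
    return mapped[:, :2] / mapped[:, 2:3]              # back to pixel coords

# Example: vibration shifted the whole image by (5, -3) pixels
H = np.array([[1.0, 0.0,  5.0],
              [0.0, 1.0, -3.0],
              [0.0, 0.0,  1.0]])
pts = np.array([[105.0, 197.0], [55.0, 37.0]])
ref = compensate_points(pts, H)   # → [[100, 200], [50, 40]]
```

Once the measured marker coordinates are mapped back into the reference frame, the ordinary pose solution can proceed as if the camera had not moved.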
The step S2 further includes a step of establishing a camera motion model, specifically:
considering the relative motion between the test background and the camera as motion of the imaging plane while the test background remains static, the projection of any point Q in three-dimensional space onto the image plane can be expressed in homogeneous coordinates as the following formulas (1) and (2):

s·[x, y, 1]ᵀ = M·[R | t]·[X, Y, Z, 1]ᵀ    (1)

M = [[fx, 0, Cx], [0, fy, Cy], [0, 0, 1]]    (2)

In formulas (1) and (2), X, Y, Z are the homogeneous coordinate values of the point Q, and x, y are its coordinate values after projection; s is a scale factor; R and t are the rotation matrix and translation vector from the world coordinate system to the camera coordinate system, i.e. the camera's extrinsic parameters; M is the camera's intrinsic parameter matrix; fx and fy are the focal lengths in the x and y directions; Cx and Cy are the x and y coordinates of the image center, which is the intersection of the optical axis with the image plane.
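A numerical sketch of formulas (1) and (2), projecting a 3-D point with the pinhole model; the intrinsic values, rotation, and translation below are illustrative placeholders, not calibration results from the patent:

```python
import numpy as np

fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
M = np.array([[fx, 0.0, cx],       # intrinsic parameter matrix, formula (2)
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                      # no rotation, for simplicity
t = np.array([0.0, 0.0, 2.0])      # camera 2 m in front of the point

def project(Q):
    """Apply formula (1): s*[x, y, 1]^T = M (R Q + t), then divide by s."""
    p = M @ (R @ Q + t)
    return p[:2] / p[2]

print(project(np.array([0.1, 0.0, 0.0])))   # → [360. 240.]
```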
Further, the step of establishing a camera motion model further comprises:
The world coordinate system is chosen so that the test background plane lies at Z = 0, and the rotation matrix R is decomposed into its 3×1 column vectors, R = (r1, r2, r3), where r1, r2, r3 are the first, second, and third column vectors of R. Because the third column vector is no longer needed when Z = 0, formula (1) can be expressed as the following formula (3):

s·[x, y, 1]ᵀ = M·[r1, r2, t]·[X, Y, 1]ᵀ    (3)

The homography matrix from any point Q in three-dimensional space to the imaging plane is therefore H = sM[r1, r2, t], and the image background distortion caused by relative motion between the camera and the photographed object can be described by an 8-parameter perspective model.
Using a background registration method, 4 pairs of corresponding background feature points are substituted to establish a system of 8 uncorrelated equations, which is solved to obtain the homography matrix H.
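The 8-equation system can be sketched as a plain direct linear solve: with the last entry of H fixed to 1, each of the 4 point pairs contributes two linear equations in the 8 remaining unknowns. The function below is a generic illustration of this step, not the patent's own implementation:

```python
import numpy as np

def homography_from_4(src, dst):
    """Solve H (h33 = 1) from 4 point correspondences: 8 equations, 8 unknowns."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h11 x + h12 y + h13) / (h31 x + h32 y + 1), and likewise for v
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(2, 1), (3, 1), (3, 2), (2, 2)]   # unit square translated by (2, 1)
H = homography_from_4(src, dst)          # → translation-only homography
```

For this translated-square example the solve recovers a pure translation matrix; with real marker data the four background pairs must not contain three collinear points, or the system becomes singular.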
The step S4 specifically includes:
In order to solve the camera motion parameters, feature point pairs between different frames need to be obtained. A circular target point, which is rotation-invariant and strongly noise-resistant, is used as the feature point. Projected through the lens onto the imaging plane, the circular target becomes an elliptical feature point; an ellipse-fitting method is used to fit the edge of the elliptical feature point in the image, and its center coordinates are then solved. In a two-dimensional rectangular coordinate system, the ellipse equation is:
f(p, m) = ax² + bxy + cy² + dx + ey + f = 0    (4)

where p and m are the ellipse equation parameter vectors, p = [a, b, c, d, e, f], m = [x², xy, y², x, y, 1], and a, b, c, d, e, f are the coefficients of the terms of the binary quadratic ellipse equation.
Further, the constraint ‖p‖ = 1 is introduced, p is normalized to unit length, and a cost function F(p) is constructed:

F(p) = Σᵢ₌₁ᴺ f(p, mᵢ)² + M·(‖p‖ − 1)²    (5)

where M is a penalty factor and N is the number of points involved in the ellipse fitting.
Therefore, the solution problem of the elliptic equation becomes the solution problem of the minimum value of the cost function F (p), the Newton method is adopted to solve p, and then the central coordinate (u) of the elliptic feature point is solved through the following formula (6) according to the solved parameter p0,v0):
Wherein (u)0,v0) Is the center coordinate of the feature point of the ellipse.
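The closed-form center in formula (6) follows from setting the gradient of f to zero (2a·u + b·v + d = 0 and b·u + 2c·v + e = 0). A minimal sketch, checked here against a circle whose center is known:

```python
import numpy as np

def ellipse_center(a, b, c, d, e):
    """Center (u0, v0) of the conic ax^2 + bxy + cy^2 + dx + ey + f = 0."""
    den = 4 * a * c - b ** 2       # positive for a real (non-degenerate) ellipse
    u0 = (b * e - 2 * c * d) / den
    v0 = (b * d - 2 * a * e) / den
    return u0, v0

# Circle (x-3)^2 + (y-4)^2 = 1, i.e. x^2 + y^2 - 6x - 8y + 24 = 0
print(ellipse_center(1, 0, 1, -6, -8))   # → (3.0, 4.0)
```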
During vision measurement, images of the feature points are acquired by the camera in real time, and the pose of different parts of the engine is measured. Because the vision measurement system must balance accuracy and measurement timeliness, a quick matching method based on position constraints is adopted.
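The patent does not detail the position-constrained quick matching; one plausible reading, sketched below under the assumption that inter-frame motion is small, is nearest-neighbour matching within a fixed search radius. The function name and radius value are illustrative:

```python
import numpy as np

def match_by_position(prev_pts, curr_pts, radius=10.0):
    """Match each point in the previous frame to the nearest point in the
    current frame, accepting the pair only within the search radius."""
    pairs = []
    for i, p in enumerate(prev_pts):
        d = np.linalg.norm(curr_pts - p, axis=1)   # distances to all candidates
        j = int(np.argmin(d))
        if d[j] <= radius:                         # the position constraint
            pairs.append((i, j))
    return pairs

prev_pts = np.array([[100.0, 100.0], [200.0, 50.0]])
curr_pts = np.array([[203.0, 52.0], [101.0, 98.0]])
print(match_by_position(prev_pts, curr_pts))   # → [(0, 1), (1, 0)]
```

Restricting the search to a small radius keeps the matching fast and rejects gross mismatches, which suits the small inter-frame jitter the method is designed to compensate.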
Claims (9)
1. A rocket engine structural member pose motion compensation method in a vibration environment is characterized by comprising the following steps:
s1: pasting a mark point;
s2: calibrating a camera;
s3: collecting an image;
s4: extracting and matching feature points;
s5: and (5) image resolving.
2. The method for compensating the pose motion of the structural member of the rocket engine in the vibration environment according to claim 1, wherein the step S1 is specifically as follows: before the test of the rocket engine, code mark points are pasted on each structural part of the rocket engine, and a plurality of code mark points are pasted on the relative static position of the test background so as to obtain the motion pose of the camera.
3. The method for compensating the pose motion of the structural member of the rocket engine in the vibration environment as recited in claim 2, wherein the step S2 is specifically as follows: and calibrating the camera by adopting the calibration plate, and shooting 9 image pairs of the calibration plate from different positions and directions to obtain an internal parameter matrix and an external parameter matrix of the camera.
4. The method for compensating the pose motion of the structural member of the rocket engine in the vibration environment according to claim 3, wherein the step S3 is specifically as follows: and starting a test run of the rocket engine, and synchronously shooting by using the camera to obtain image data.
5. The method for compensating the pose motion of the structural member of the rocket engine in the vibration environment according to claim 4, wherein the step S4 is specifically as follows: in order to solve the camera motion parameters, it is necessary to obtain pairs of feature points between different frames.
6. The method for compensating the pose motion of the structural member of the rocket engine in the vibration environment according to claim 5, wherein the step S5 is specifically as follows: resolving the pose of each structural part of the rocket engine, and resolving the feature points of the relative static position of the test background; and compensating the feature point calculation matrix of the relative static position to a pose calculation result of a corresponding structural member of the corresponding frame frequency image.
7. A rocket engine structure pose motion compensation method under a vibration environment as recited in claim 2, wherein step S2 further comprises the step of establishing a camera motion model, specifically:
considering the relative motion between the test background and the camera as motion of the imaging plane while the test background remains static, the projection of any point Q in three-dimensional space onto the image plane can be expressed in homogeneous coordinates as the following formulas (1) and (2):

s·[x, y, 1]ᵀ = M·[R | t]·[X, Y, Z, 1]ᵀ    (1)

M = [[fx, 0, Cx], [0, fy, Cy], [0, 0, 1]]    (2)

In formulas (1) and (2), X, Y, Z are the homogeneous coordinate values of the point Q, and x, y are its coordinate values after projection; s is a scale factor; R and t are the rotation matrix and translation vector from the world coordinate system to the camera coordinate system, i.e. the camera's extrinsic parameters; M is the camera's intrinsic parameter matrix; fx and fy are the focal lengths in the x and y directions; Cx and Cy are the x and y coordinates of the image center, which is the intersection of the optical axis with the image plane.
8. A rocket engine structure pose motion compensation method under a vibratory environment as recited in claim 7, wherein the step of modeling the motion of the camera further comprises:
The world coordinate system is chosen so that the test background plane lies at Z = 0, and the rotation matrix R is decomposed into its 3×1 column vectors, R = (r1, r2, r3), where r1, r2, r3 are the first, second, and third column vectors of R. Because the third column vector is no longer needed when Z = 0, formula (1) can be expressed as the following formula (3):

s·[x, y, 1]ᵀ = M·[r1, r2, t]·[X, Y, 1]ᵀ    (3)

The homography matrix from any point Q in three-dimensional space to the imaging plane is therefore H = sM[r1, r2, t], and the image background distortion caused by relative motion between the camera and the photographed object can be described by an 8-parameter perspective model.
9. A rocket engine structure position and posture motion compensation method under vibration environment as claimed in claim 5, wherein step S4 specifically includes:
In order to solve the camera motion parameters, feature point pairs between different frames need to be obtained. A circular target point, which is rotation-invariant and strongly noise-resistant, is used as the feature point. Projected through the lens onto the imaging plane, the circular target becomes an elliptical feature point; an ellipse-fitting method is used to fit the edge of the elliptical feature point in the image, and its center coordinates are then solved. In a two-dimensional rectangular coordinate system, the ellipse equation is:
f(p, m) = ax² + bxy + cy² + dx + ey + f = 0    (4)

where p and m are the ellipse equation parameter vectors, p = [a, b, c, d, e, f], m = [x², xy, y², x, y, 1], and a, b, c, d, e, f are the coefficients of the terms of the binary quadratic ellipse equation.
Further, the constraint ‖p‖ = 1 is introduced, p is normalized to unit length, and a cost function F(p) is constructed:

F(p) = Σᵢ₌₁ᴺ f(p, mᵢ)² + M·(‖p‖ − 1)²    (5)

where M is a penalty factor and N is the number of points participating in the ellipse fitting;
therefore, the solution of the ellipse equation becomes the minimization of the cost function F(p). Newton's method is used to solve for p, and the center coordinates (u0, v0) of the elliptical feature point are then solved from the fitted parameters p through the following formula (6):

u0 = (be − 2cd) / (4ac − b²),  v0 = (bd − 2ae) / (4ac − b²)    (6)

where (u0, v0) are the center coordinates of the elliptical feature point.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110776175.1A CN113610763A (en) | 2021-07-09 | 2021-07-09 | Rocket engine structural member pose motion compensation method in vibration environment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110776175.1A CN113610763A (en) | 2021-07-09 | 2021-07-09 | Rocket engine structural member pose motion compensation method in vibration environment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113610763A true CN113610763A (en) | 2021-11-05 |
Family
ID=78304278
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110776175.1A Pending CN113610763A (en) | 2021-07-09 | 2021-07-09 | Rocket engine structural member pose motion compensation method in vibration environment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113610763A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114062265A (en) * | 2021-11-11 | 2022-02-18 | 易思维(杭州)科技有限公司 | Method for evaluating stability of supporting structure of visual system |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101311963A (en) * | 2008-06-17 | 2008-11-26 | 东南大学 | Round mark point center picture projection point position acquiring method for positioning video camera |
CN104596502A (en) * | 2015-01-23 | 2015-05-06 | 浙江大学 | Object posture measuring method based on CAD model and monocular vision |
CN106485757A (en) * | 2016-10-13 | 2017-03-08 | 哈尔滨工业大学 | A kind of Camera Calibration of Stereo Vision System platform based on filled circles scaling board and scaling method |
CN109238235A (en) * | 2018-06-29 | 2019-01-18 | 华南农业大学 | Monocular sequence image realizes rigid body pose parameter continuity measurement method |
CN109405835A (en) * | 2017-08-31 | 2019-03-01 | 北京航空航天大学 | Relative pose measurement method based on noncooperative target straight line and circle monocular image |
CN110443854A (en) * | 2019-08-05 | 2019-11-12 | 兰州交通大学 | Based on fixed target without relative pose scaling method between public view field camera |
CN111462135A (en) * | 2020-03-31 | 2020-07-28 | 华东理工大学 | Semantic mapping method based on visual S L AM and two-dimensional semantic segmentation |
CN112504121A (en) * | 2020-12-02 | 2021-03-16 | 西安航天动力研究所 | System and method for monitoring structural attitude of high-thrust rocket engine |
- 2021-07-09: CN202110776175.1A patent application filed (CN113610763A), status: Pending
Non-Patent Citations (1)
Title |
---|
李蒙蒙: ""基于双目视觉的火箭喷管运动姿态测量系统研究"", 《中国优秀硕士学位论文全文数据库 工程科技Ⅱ辑》 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114062265A (en) * | 2021-11-11 | 2022-02-18 | 易思维(杭州)科技有限公司 | Method for evaluating stability of supporting structure of visual system |
CN114062265B (en) * | 2021-11-11 | 2023-06-30 | 易思维(杭州)科技有限公司 | Evaluation method for stability of support structure of vision system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107492127B (en) | Light field camera parameter calibration method and device, storage medium and computer equipment | |
Weng et al. | Calibration of stereo cameras using a non-linear distortion model (CCD sensory) | |
CN110099267B (en) | Trapezoidal correction system, method and projector | |
CN110264528B (en) | Rapid self-calibration method for binocular camera with fish-eye lens | |
Chatterjee et al. | Algorithms for coplanar camera calibration | |
CN110189400B (en) | Three-dimensional reconstruction method, three-dimensional reconstruction system, mobile terminal and storage device | |
CN107633533B (en) | High-precision circular mark point center positioning method and device under large-distortion lens | |
CN114494464A (en) | Calibration method of line scanning camera | |
CN112792814B (en) | Mechanical arm zero calibration method based on visual marks | |
CN115861445B (en) | Hand-eye calibration method based on three-dimensional point cloud of calibration plate | |
CN110766759B (en) | Multi-camera calibration method and device without overlapped view fields | |
CN113610763A (en) | Rocket engine structural member pose motion compensation method in vibration environment | |
JP2008131176A (en) | Camera calibration device | |
CN113870366A (en) | Calibration method and calibration system of three-dimensional scanning system based on pose sensor | |
CN116681772A (en) | Multi-camera online calibration method under non-common view | |
CN110310243B (en) | Unmanned aerial vehicle photogrammetry image correction method, system and storage medium | |
CN108955642B (en) | Large-breadth equivalent center projection image seamless splicing method | |
CN110139094A (en) | A kind of optical center alignment schemes, optical center Coordinate calculation method and device | |
Von Gioi et al. | Lens distortion correction with a calibration harp | |
JP2006098065A (en) | Calibration device and method, and three-dimensional modelling device and system capable of using the same | |
CN111652945A (en) | Camera calibration method | |
CN112556657B (en) | Multi-view vision measurement system for flight motion parameters of separating body in vacuum environment | |
CN115719387A (en) | 3D camera calibration method, point cloud image acquisition method and camera calibration system | |
CN112529969B (en) | XY axis positioning compensation method of chip mounter | |
CN113870364B (en) | Self-adaptive binocular camera calibration method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||