CN110866954B - Method for measuring high-precision attitude of bullet target under length constraint

Method for measuring high-precision attitude of bullet target under length constraint

Info

Publication number
CN110866954B
CN110866954B (application CN201911106730.9A)
Authority
CN
China
Prior art keywords
tail
head
missile
target
coordinate
Prior art date
Legal status
Active
Application number
CN201911106730.9A
Other languages
Chinese (zh)
Other versions
CN110866954A (en)
Inventor
杨夏
郭贵松
宁丞浩
甘叔玮
叶雪辀
张小虎
Current Assignee
Sun Yat Sen University
Original Assignee
Sun Yat Sen University
Priority date
Filing date
Publication date
Application filed by Sun Yat Sen University filed Critical Sun Yat Sen University
Priority to CN201911106730.9A priority Critical patent/CN110866954B/en
Publication of CN110866954A publication Critical patent/CN110866954A/en
Application granted granted Critical
Publication of CN110866954B publication Critical patent/CN110866954B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The invention discloses a method for measuring the high-precision attitude of a missile target under a length constraint. A binocular stereoscopic vision convergence measuring system photographs the missile target during the target-impact process, images of the missile target at impact are obtained, and the world coordinates of the missile target are computed. The known length from the missile head to the missile tail is used to compensate for the insufficient accuracy of the image measuring system along the missile body direction, producing a high-precision three-dimensional measurement result and hence a high-precision body attitude.

Description

Method for measuring high-precision attitude of bullet target under length constraint
Technical Field
The invention belongs to the technical field of projectile attitude measurement, and particularly relates to a method for measuring the high-precision attitude of a projectile under length constraint.
Background
The attitude of the projectile at the moment it strikes the target has an important influence on the damage effect, so high-precision measurement of the attitude at impact is an important parameter index for evaluating and analyzing the damage effect. The impact process, however, is highly dynamic and destructive, which rules out contact measurement and close-range non-contact station layouts.
Radar is a mature target-detection method, but radar systems are large, technically complex and expensive. They are better suited to detecting and tracking large over-the-horizon targets, have relatively low absolute accuracy for short-range, high-speed small targets, and are therefore not suitable for measuring the attitude and velocity of a projectile just before impact.
High-speed camera measurement photographs the moving target at successive instants to form an image sequence that contains both the velocity and the attitude information of the target; analyzing and processing the sequence yields the attitude and velocity of the moving target. However, camera accuracy along the viewing (depth) direction is low, so the missile attitude obtained directly by ordinary binocular intersection is not accurate.
Image measurement is well suited to the impact process of the projectile, but because the observation distance is long and the intersection angle is small (less than 60 degrees), the measurement accuracy of the projectile attitude is poor, and the usual layout with a binocular camera pair on one side cannot meet the task requirement. Although binocular small-angle intersection offers high two-dimensional accuracy in the image plane, its accuracy along the camera optical axis is poor.
Disclosure of Invention
To address these problems in the prior art, the invention provides a method for measuring the high-precision attitude of a projectile target under a length constraint. The invention corrects the camera positioning error by combining the known missile length, and can achieve an attitude accuracy within 0.1 degree.
In order to achieve the technical purpose, the technical scheme adopted by the invention is as follows:
a method for measuring the high-precision attitude of a bullet target under length constraint comprises the following steps:
firstly, a binocular stereoscopic vision intersection measuring system is built.
The binocular stereoscopic vision convergence measuring system comprises two CCD cameras with the same focal length; the two CCD cameras are arranged symmetrically on the two sides of the target, the missile target enters the common field of view of the two cameras from the space between the left and right cameras, and the left and right cameras record images of the process from the moment the missile enters the common field of view until the missile target reaches the target.
And secondly, calibrating the binocular stereoscopic vision intersection measuring system to obtain the internal and external parameters of the left camera and the right camera in the binocular stereoscopic vision intersection measuring system.
And thirdly, shooting the missile target in the target hitting process by using a binocular stereoscopic vision convergence measuring system to obtain an image of the missile target when hitting.
And fourthly, acquiring world coordinates of the missile target.
World coordinates of the missile target comprise the missile head p_head coordinates (x_head, y_head, z_head) and the missile tail p_tail coordinates (x_tail, y_tail, z_tail). Based on the internal and external parameters of the left and right cameras obtained by calibration in the second step, the head coordinates p_head = (x_head, y_head, z_head) and the tail coordinates p_tail = (x_tail, y_tail, z_tail) are calculated by triangulation.
And fifthly, correcting the positioning error.
The length L from the missile head to the missile tail of the missile target is known; L is used to correct the missile head coordinates p_head and the missile tail coordinates p_tail, giving the corrected head coordinates p'_head and tail coordinates p'_tail.
Sixthly, the attitude of the missile target is calculated from the corrected head coordinates p'_head and tail coordinates p'_tail.
Further, in the first step of the present invention, the method for determining the position coordinates of the left and right cameras is as follows:
the middle point of the missile body when the missile target is perpendicular to the target is taken as an original point O', the direction perpendicular to the surface of the target is taken as a z-axis, and the straight line parallel to the surface of the target and perpendicular to the z-axis is taken as an x-axis.
The detection area is a square region of side length q centered on the origin O' (i.e. a q × q area; q is determined from the length L from the missile head to the missile tail of the missile target, with q at least twice the missile length), and the left and right cameras are arranged on the left and right sides of the detection area. The circumscribed circle of the q × q detection area is taken as the effective field-of-view region of the left and right cameras; this region is centered at O' and has radius r = (√2/2)·q.
The field angles of the two cameras are both 2α and the included angles between the cameras' optical axes and the z-axis are both φ; the position coordinates of the left and right cameras then follow from this geometry (a geometric sketch is given below).
[The explicit camera position coordinates appear in the original only as equation images.]
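The following Python sketch illustrates the layout geometry just described: it computes the effective field-of-view radius of the circumscribed circle and a candidate camera standoff from q, the half field angle α and the tilt φ. The patent's explicit camera-coordinate expressions are not reproduced in the text above, so the placement returned here (both cameras in the x-z plane, each just covering the circumscribed circle) is an illustrative assumption, not the patented formula.

```python
import math

def camera_layout(q, alpha, phi):
    """Sketch of the step-one layout geometry.

    q     : side length of the square detection area (same unit as L)
    alpha : half field angle of each camera (radians)
    phi   : angle between each camera's optical axis and the z-axis (radians)

    Returns the effective field-of-view radius and hypothetical left/right
    camera positions in the target coordinate system (origin O', z along
    the target normal, x parallel to the target surface).
    """
    # Radius of the circle circumscribing the q x q detection area.
    r = math.sqrt(2.0) * q / 2.0

    # Assumption: each camera is placed so that its cone of half-angle alpha
    # just covers the circumscribed circle, giving a standoff distance d.
    d = r / math.sin(alpha)

    # Assumption: both cameras lie in the x-z plane, tilted by +/- phi from
    # the z-axis and looking toward the origin O'.
    left_cam = (-d * math.sin(phi), 0.0, d * math.cos(phi))
    right_cam = (d * math.sin(phi), 0.0, d * math.cos(phi))
    return r, left_cam, right_cam
```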
In the invention, the camera calibration follows the calibration method conventional in the field. First, calibration plate images are collected and imported into Matlab; the internal and external parameters of the left and right cameras are then obtained with the Zhang Zhengyou calibration method (using Matlab's built-in functions). The internal parameters comprise the focal lengths f_a and f_b of the left and right cameras; the external parameters comprise the translation matrix T_ab and the rotation matrix R_ab between the left and right cameras.
In the invention, a translation matrix T between the left camera and the right cameraabAnd a rotation matrix RabThe following were used:
Figure GDA0002358362820000041
Figure GDA0002358362820000042
Further, in the fourth step of the invention, the missile head coordinates p_head = (x_head, y_head, z_head) and the missile tail coordinates p_tail = (x_tail, y_tail, z_tail) are obtained as follows:
[Equation images in original: closed-form triangulation expressions for p_head and p_tail.]
In the formulas, f_a and f_b are the focal lengths of the left and right cameras (both equal to f); (x_head, y_head, z_head) are the missile head coordinates p_head and (x_tail, y_tail, z_tail) the missile tail coordinates p_tail; T_ab is the translation matrix and R_ab the rotation matrix between the left and right cameras; (X_tail_a, Y_tail_a) and (X_tail_b, Y_tail_b) are the image coordinates of the missile tail point in the left and right camera image-plane coordinate systems; (X_head_a, Y_head_a) and (X_head_b, Y_head_b) are the image coordinates of the missile head point in the left and right camera image-plane coordinate systems.
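Because the patent's closed-form triangulation expressions survive only as equation images, the sketch below uses a standard linear (DLT) triangulation built from the calibrated intrinsics and the stereo extrinsics R_ab, T_ab. It illustrates the same step-four computation but is not the patent's exact formula; the names K_a, K_b, pt_a, pt_b are illustrative.

```python
import numpy as np

def triangulate_point(K_a, K_b, R_ab, T_ab, pt_a, pt_b):
    """Generic linear triangulation of one point from a calibrated stereo pair.

    K_a, K_b : 3x3 intrinsic matrices of the left and right cameras
    R_ab     : 3x3 rotation from the left-camera frame to the right-camera frame
    T_ab     : 3-vector translation from the left-camera frame to the right-camera frame
    pt_a     : (X_a, Y_a) image coordinates in the left image
    pt_b     : (X_b, Y_b) image coordinates in the right image

    Returns the 3D point expressed in the left-camera (world) frame.
    """
    # Projection matrices: the left camera defines the world frame here.
    P_a = K_a @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P_b = K_b @ np.hstack([R_ab, np.asarray(T_ab, float).reshape(3, 1)])

    # Direct linear transform: each view contributes two linear equations.
    A = np.vstack([
        pt_a[0] * P_a[2] - P_a[0],
        pt_a[1] * P_a[2] - P_a[1],
        pt_b[0] * P_b[2] - P_b[0],
        pt_b[1] * P_b[2] - P_b[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Example usage for the head and tail points:
# p_head = triangulate_point(K_a, K_b, R_ab, T_ab, (X_head_a, Y_head_a), (X_head_b, Y_head_b))
# p_tail = triangulate_point(K_a, K_b, R_ab, T_ab, (X_tail_a, Y_tail_a), (X_tail_b, Y_tail_b))
```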
Furthermore, in the fifth step of the present invention, the corrected projectile head coordinate p'headAnd missile tail coordinate p'tailThe following are:
(x'head-x'tail)2+(y'head-y'tail)2+(z'head-z′tail)2=L2
wherein (x'head,y'head,z'head) Missile head p 'for correcting positioning error'headWorld coordinate value of (x'tail,y'tail,z'tail) To correct for positioning error rear missile tail p'tailThe world coordinate value of (2).
Wherein, x'head=xhead,y′head=yhead,x′tail=xtail,y′tail=ytail,z′tail=ztailCorrected missile head z'headThe coordinate values are obtained by solving the following equation:
Figure GDA0002358362820000051
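A minimal sketch of this step-five correction follows, assuming (as stated above) that x, y and the tail z are kept and only z_head is recomputed. The length constraint has two algebraic roots; the recovered text does not state a selection rule, so taking the root closest to the originally triangulated z_head is an assumption of this sketch.

```python
import math

def correct_head_z(p_head, p_tail, L):
    """Correct the head z-coordinate using the known head-to-tail length L.

    p_head, p_tail : triangulated (x, y, z) world coordinates
    L              : known head-to-tail length of the missile target
    Returns the corrected (head, tail) coordinate tuples.
    """
    x_h, y_h, z_h = p_head
    x_t, y_t, z_t = p_tail

    planar_sq = (x_h - x_t) ** 2 + (y_h - y_t) ** 2
    if planar_sq > L ** 2:
        raise ValueError("in-plane separation exceeds the known length L")

    # Two roots of the length constraint; keep the one closest to the
    # originally measured z_head (assumption, see lead-in above).
    dz = math.sqrt(L ** 2 - planar_sq)
    z_h_corr = min((z_t + dz, z_t - dz), key=lambda z: abs(z - z_h))

    return (x_h, y_h, z_h_corr), (x_t, y_t, z_t)
```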
Further, the attitude of the missile target obtained in the sixth step of the invention is as follows:
α = arccos(Δx / sqrt(Δx^2 + Δy^2 + Δz^2))
β = arccos(Δy / sqrt(Δx^2 + Δy^2 + Δz^2))
γ = arccos(Δz / sqrt(Δx^2 + Δy^2 + Δz^2))
Δx = x'_head - x'_tail
Δy = y'_head - y'_tail
Δz = z'_head - z'_tail
where α is the included angle between the missile target and the x-axis, β the included angle between the missile target and the y-axis, and γ the included angle between the missile target and the z-axis;
the values α, β and γ calculated from these formulas constitute the attitude of the missile target.
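A minimal Python sketch of this step-six computation, using the direction-cosine form implied by the definitions above (angles returned in degrees; variable names are illustrative):

```python
import math

def attitude_angles(p_head_corr, p_tail_corr):
    """Direction angles of the corrected head-to-tail axis with respect to
    the x, y and z axes. Returns (alpha, beta, gamma) in degrees."""
    dx = p_head_corr[0] - p_tail_corr[0]
    dy = p_head_corr[1] - p_tail_corr[1]
    dz = p_head_corr[2] - p_tail_corr[2]

    # After the length-constraint correction this norm equals L.
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)

    alpha = math.degrees(math.acos(dx / norm))
    beta = math.degrees(math.acos(dy / norm))
    gamma = math.degrees(math.acos(dz / norm))
    return alpha, beta, gamma
```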
Compared with the prior art, the invention has the beneficial technical effects that:
the invention provides a method for measuring the attitude of a missile body by using a binocular stereoscopic vision intersection measuring system, which comprises the steps of arranging two cameras of the binocular stereoscopic vision intersection measuring system at two sides of the rear side of the missile tail, and compensating the problem of insufficient accuracy of an image measuring system along the direction of the missile body by using the known length from the missile target warhead to the tail of the missile, so as to obtain a three-dimensional high-accuracy measuring result, thereby obtaining the high-accuracy missile body attitude.
Drawings
FIG. 1 is a schematic view of a binocular stereo vision convergence measurement system layout;
FIG. 2 is a measurement schematic diagram of the binocular stereo vision convergence measurement system.
Detailed Description
The embodiment provides a method for measuring the high-precision attitude of a bullet target under length constraint, which comprises the following steps:
firstly, building a binocular stereoscopic vision intersection measuring system;
referring to fig. 1, the binocular stereoscopic vision intersection measuring system comprises two CCD cameras symmetrically arranged left and right, and the focal lengths of the left and right cameras are both f. As shown in figure 1, a left camera and a right camera are bilaterally symmetrical to a target, a missile to be shot enters a two-camera field of view area from the middle of the left camera and the right camera, the left camera and the right camera record images of the process from the time the missile enters the field of view to the time the missile target reaches the target, and the posture of the missile target landing process is analyzed through the images. ,
As shown in FIG. 1, the midpoint of the missile body at the instant the missile target strikes the target perpendicularly is taken as the origin O', the direction perpendicular to the target surface as the z-axis, and the straight line parallel to the target surface and perpendicular to the z-axis as the x-axis. The detection area is the region occupied by the missile when it reaches the target: a square of side length q centered on the origin (q is determined from the actual missile length L; in this embodiment q is twice the missile length). The left and right cameras are arranged on the left and right sides of the detection area.
The circumscribed circle of the q × q detection area is taken as the effective field of view of the left and right cameras; its center O' is the midpoint of the missile body at perpendicular impact, and its radius is r = (√2/2)·q.
The layout positions of the two cameras are determined from this coordinate relationship, which gives the position coordinates of the left and right cameras.
[Equation images in original: explicit position coordinates of the left and right cameras.]
The field angle of each camera is 2α and the included angle between each camera's optical axis and the z-axis is φ, so that an effective field of view covering the q × q detection area is obtained.
Secondly, the binocular stereo vision intersection measuring system is calibrated to obtain the internal parameters of the left and right cameras (focal length f) and, as external parameters, the translation matrix T_ab and the rotation matrix R_ab between the left and right cameras:
[Equation images in original: the calibrated T_ab and R_ab.]
In the formulas, T_ab is the translation matrix between the left camera a and the right camera b, and R_ab is the rotation matrix between the left camera a and the right camera b.
The calibration method of the camera adopts the conventional calibration method in the field, and the calibration is as follows:
(1) Collect calibration plate images.
Fix the positions of the two cameras, place the checkerboard calibration plate at the position of the missile target so that the checkerboard fills as much of the camera field of view as possible, change the orientation of the calibration plate in turn, and acquire five different images.
(2) Calibrate the cameras with the Zhang Zhengyou calibration method (using Matlab's built-in functions) to obtain the internal and external camera parameters, as follows (an OpenCV-based alternative is sketched after this procedure):
i) Import the images. Open the Stereo Camera Calibrator toolbox included with Matlab and import the images captured by the left and right cameras in (1).
ii) Fill in the actual side length of each checkerboard square; the checkerboard squares used in this example have a side length of 25 mm.
iii) Calculate the internal and external parameters. Once the checkerboard images are previewed and successfully detected, start the calibration; clicking Calibrate produces the calibration results, i.e. the internal and external parameters.
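The embodiment uses Matlab's Stereo Camera Calibrator; the sketch below performs a comparable stereo calibration in Python with OpenCV instead. The file patterns and the 9 × 6 corner pattern are assumptions, the 25 mm square size follows the example above; only the outputs (intrinsics plus R_ab and T_ab) correspond to the parameters used later.

```python
import glob
import cv2
import numpy as np

def stereo_calibrate(left_glob, right_glob, pattern=(9, 6), square_mm=25.0):
    """Stereo calibration sketch using OpenCV checkerboard detection.

    Returns the intrinsics and distortion of both cameras and the rotation
    R_ab and translation T_ab between them.
    """
    # 3D checkerboard corner coordinates in the board frame (z = 0).
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm

    obj_pts, img_pts_a, img_pts_b, size = [], [], [], None
    for fa, fb in zip(sorted(glob.glob(left_glob)), sorted(glob.glob(right_glob))):
        gray_a = cv2.cvtColor(cv2.imread(fa), cv2.COLOR_BGR2GRAY)
        gray_b = cv2.cvtColor(cv2.imread(fb), cv2.COLOR_BGR2GRAY)
        ok_a, corners_a = cv2.findChessboardCorners(gray_a, pattern)
        ok_b, corners_b = cv2.findChessboardCorners(gray_b, pattern)
        if ok_a and ok_b:
            obj_pts.append(objp)
            img_pts_a.append(corners_a)
            img_pts_b.append(corners_b)
            size = gray_a.shape[::-1]
    if not obj_pts:
        raise ValueError("checkerboard not detected in any image pair")

    # Intrinsics of each camera, then the stereo extrinsics R_ab, T_ab.
    _, K_a, d_a, _, _ = cv2.calibrateCamera(obj_pts, img_pts_a, size, None, None)
    _, K_b, d_b, _, _ = cv2.calibrateCamera(obj_pts, img_pts_b, size, None, None)
    _, K_a, d_a, K_b, d_b, R_ab, T_ab, _, _ = cv2.stereoCalibrate(
        obj_pts, img_pts_a, img_pts_b, K_a, d_a, K_b, d_b, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K_a, d_a, K_b, d_b, R_ab, T_ab
```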
And thirdly, shooting the missile in the target landing process by using a binocular stereoscopic vision convergence measuring system to obtain an image of the missile in the target landing process.
And fourthly, acquiring world coordinates of the bullet target.
World coordinates of the missile target comprise the missile head p_head coordinates (x_head, y_head, z_head) and the missile tail p_tail coordinates (x_tail, y_tail, z_tail).
Based on the internal and external parameters of the left and right cameras obtained by calibration in the second step, the head coordinates p_head = (x_head, y_head, z_head) and the tail coordinates p_tail = (x_tail, y_tail, z_tail) are calculated by triangulation.
[Equation images in original: closed-form triangulation expressions for p_head and p_tail.]
In the formulas, f_a and f_b are the focal lengths of the left and right cameras (both equal to f); (x_head, y_head, z_head) are the missile head coordinates p_head and (x_tail, y_tail, z_tail) the missile tail coordinates p_tail; T_ab is the translation matrix and R_ab the rotation matrix between the left and right cameras; (X_tail_a, Y_tail_a) and (X_tail_b, Y_tail_b) are the image coordinates of the missile tail point in the left and right camera image-plane coordinate systems, measured in the images acquired at impact; (X_head_a, Y_head_a) and (X_head_b, Y_head_b) are the corresponding image coordinates of the missile head point in the left and right camera image-plane coordinate systems.
Fifthly, correcting the positioning error;
The length L from the missile head to the missile tail of the missile target is known; L is used to correct the missile head coordinates p_head and the missile tail coordinates p_tail, giving the corrected head coordinates p'_head and tail coordinates p'_tail.
The head coordinates obtained in the previous step are p_head = (x_head, y_head, z_head) and the tail coordinates are p_tail = (x_tail, y_tail, z_tail); these coordinate values are corrected using L according to
(x'_head - x'_tail)^2 + (y'_head - y'_tail)^2 + (z'_head - z'_tail)^2 = L^2
where (x'_head, y'_head, z'_head) are the world coordinates of the missile head p'_head after positioning-error correction and (x'_tail, y'_tail, z'_tail) are the world coordinates of the missile tail p'_tail after positioning-error correction.
Owing to the imaging characteristics of the cameras, coordinates in directions parallel to the image plane are accurate while the coordinate along the direction perpendicular to the image plane is not; therefore the x and y values of the head and tail world coordinates are accurate while the z values are poor. The x and y coordinates are unchanged by the correction, i.e. x'_head = x_head, y'_head = y_head, x'_tail = x_tail, y'_tail = y_tail. Because the missile tail is closer to the cameras, its z coordinate is more accurate than that of the head, so the z_tail value is assumed correct, i.e. z'_tail = z_tail, and the corrected missile head coordinate z'_head is then obtained by solving the equation
(x_head - x_tail)^2 + (y_head - y_tail)^2 + (z'_head - z_tail)^2 = L^2.
and sixthly, calculating the posture of the missile target.
The attitude of the missile is calculated from the corrected head and tail world coordinates p'_head and p'_tail as follows:
the two points are known: p'_head = (x'_head, y'_head, z'_head) and p'_tail = (x'_tail, y'_tail, z'_tail);
α = arccos(Δx / sqrt(Δx^2 + Δy^2 + Δz^2))
β = arccos(Δy / sqrt(Δx^2 + Δy^2 + Δz^2))
γ = arccos(Δz / sqrt(Δx^2 + Δy^2 + Δz^2))
Δx = x'_head - x'_tail
Δy = y'_head - y'_tail
Δz = z'_head - z'_tail
where α is the included angle between the missile and the x-axis, β the included angle between the missile and the y-axis, and γ the included angle between the missile and the z-axis.
The values α, β and γ calculated from the above formulas constitute the attitude of the missile.
The above embodiments express only several implementations of the present application, and their description is specific and detailed, but they shall not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the scope of protection of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (4)

1. A method for measuring the high-precision attitude of a bullet target under length constraint is characterized by comprising the following steps:
firstly, building a binocular stereoscopic vision intersection measuring system;
the binocular stereoscopic vision convergence measuring system comprises two CCD cameras with the same focal length; the two CCD cameras are arranged symmetrically on the two sides of the target, the missile target enters the common field of view of the two cameras from between the left and right cameras, and the left and right cameras record images of the process from the moment the missile enters the common field of view until the missile target reaches the target; the method for determining the position coordinates of the left and right cameras comprises the following steps:
taking the middle point of a missile body when a missile target is perpendicular to the target as an original point O', taking the direction perpendicular to the surface of the target as a z-axis, and taking a straight line parallel to the surface of the target and perpendicular to the z-axis as an x-axis;
the detection area is a square region of side length q centered on the origin O' (i.e. a q × q area), and the left and right cameras are arranged on the left and right sides of the detection area; the circumscribed circle of the q × q detection area is taken as the effective field-of-view region of the left and right cameras, the center of the effective field-of-view region is O' and its radius is r = (√2/2)·q;
the field angles of the two cameras are both 2α and the included angles between the two cameras' optical axes and the z-axis are both φ;
the position coordinates of the left and right cameras are:
[equation images in the original claim: explicit position coordinates of the left and right cameras]
secondly, calibrating the binocular stereoscopic vision intersection measuring system to obtain the internal and external parameters of the left and right cameras in the system, wherein the internal parameters of the left and right cameras comprise the focal lengths f_a and f_b of the left and right cameras, and the external parameters comprise the translation matrix T_ab and the rotation matrix R_ab between the left and right cameras, the translation matrix T_ab and the rotation matrix R_ab between the left and right cameras being as follows:
[equation images in the original claim: T_ab and R_ab]
thirdly, shooting the missile target in the target hitting process by using a binocular stereoscopic vision convergence measuring system to obtain an image of the missile target when hitting;
fourthly, acquiring world coordinates of the missile target;
world coordinates of the missile target comprise the missile head p_head coordinates (x_head, y_head, z_head) and the missile tail p_tail coordinates (x_tail, y_tail, z_tail); based on the internal and external parameters of the left and right cameras obtained by calibration in the second step, the head coordinates p_head = (x_head, y_head, z_head) and the tail coordinates p_tail = (x_tail, y_tail, z_tail) are calculated by triangulation;
Fifthly, correcting the positioning error;
the length L from the missile head to the missile tail of the missile target is known; L is used to correct the missile head coordinates p_head and the missile tail coordinates p_tail, giving the corrected head coordinates p'_head and tail coordinates p'_tail satisfying:
(x'_head - x'_tail)^2 + (y'_head - y'_tail)^2 + (z'_head - z'_tail)^2 = L^2
wherein (x'_head, y'_head, z'_head) are the world coordinates of the missile head p'_head after positioning-error correction and (x'_tail, y'_tail, z'_tail) are the world coordinates of the missile tail p'_tail after positioning-error correction;
wherein x'_head = x_head, y'_head = y_head, x'_tail = x_tail, y'_tail = y_tail and z'_tail = z_tail, and the corrected missile head coordinate z'_head is obtained by solving the following equation:
(x_head - x_tail)^2 + (y_head - y_tail)^2 + (z'_head - z_tail)^2 = L^2;
sixthly, calculating the attitude of the missile target from the corrected missile head coordinates p'_head and missile tail coordinates p'_tail:
α = arccos(Δx / sqrt(Δx^2 + Δy^2 + Δz^2))
β = arccos(Δy / sqrt(Δx^2 + Δy^2 + Δz^2))
γ = arccos(Δz / sqrt(Δx^2 + Δy^2 + Δz^2))
Δx = x'_head - x'_tail
Δy = y'_head - y'_tail
Δz = z'_head - z'_tail
wherein α is the included angle between the missile target and the x-axis, β is the included angle between the missile target and the y-axis, and γ is the included angle between the missile target and the z-axis;
α, β and γ obtained from these formulas constitute the attitude of the missile target.
2. The method for measuring the high-precision attitude of a missile target under length constraint according to claim 1, wherein in the first step, q is determined according to the length L from the missile head to the missile tail of the missile target, and q is more than twice the missile length.
3. The method for measuring the high-precision attitude of a missile target under length constraint according to claim 1, wherein in the second step, the calibration method is as follows: first, calibration plate images are collected and imported into Matlab, and then the internal and external parameters of the left and right cameras are obtained using the Zhang Zhengyou calibration method in Matlab.
4. The method for measuring the high-precision attitude of a missile target under length constraint according to claim 1, wherein in the fourth step, the missile head coordinates p_head = (x_head, y_head, z_head) and the missile tail coordinates p_tail = (x_tail, y_tail, z_tail) are obtained as follows:
[equation images in the original claim: closed-form triangulation expressions for p_head and p_tail]
in the formulas, f_a and f_b are the focal lengths of the left and right cameras (both equal to f); (x_head, y_head, z_head) are the missile head coordinates p_head and (x_tail, y_tail, z_tail) the missile tail coordinates p_tail; T_ab is the translation matrix and R_ab the rotation matrix between the left and right cameras; (X_tail_a, Y_tail_a) and (X_tail_b, Y_tail_b) are the image coordinates of the missile tail point in the left and right camera image-plane coordinate systems; (X_head_a, Y_head_a) and (X_head_b, Y_head_b) are the image coordinates of the missile head point in the left and right camera image-plane coordinate systems.
CN201911106730.9A 2019-11-13 2019-11-13 Method for measuring high-precision attitude of bullet target under length constraint Active CN110866954B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911106730.9A CN110866954B (en) 2019-11-13 2019-11-13 Method for measuring high-precision attitude of bullet target under length constraint


Publications (2)

Publication Number Publication Date
CN110866954A CN110866954A (en) 2020-03-06
CN110866954B 2022-04-22

Family

ID=69654352

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911106730.9A Active CN110866954B (en) 2019-11-13 2019-11-13 Method for measuring high-precision attitude of bullet target under length constraint

Country Status (1)

Country Link
CN (1) CN110866954B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112556639B (en) * 2020-11-06 2023-05-09 广州艾目易科技有限公司 Device and method for testing actual effective visual field range of binocular vision system
CN117516481B (en) * 2024-01-08 2024-04-16 北京奥博泰科技有限公司 Dynamic image intersection measuring method and device


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103247053A (en) * 2013-05-16 2013-08-14 大连理工大学 Accurate part positioning method based on binocular microscopy stereo vision
CN104315983A (en) * 2014-10-16 2015-01-28 天津大学 Method for increasing coordinate measurement field accuracy through space multi-length constraint
CN106981083A (en) * 2017-03-22 2017-07-25 大连理工大学 The substep scaling method of Binocular Stereo Vision System camera parameters
CN107907048A (en) * 2017-06-30 2018-04-13 长沙湘计海盾科技有限公司 A kind of binocular stereo vision method for three-dimensional measurement based on line-structured light scanning
US9857172B1 (en) * 2017-09-25 2018-01-02 Beijing Information Science And Technology University Method for implementing high-precision orientation and evaluating orientation precision of large-scale dynamic photogrammetry system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Rongrong Yu et al., "A High Accuracy Attitude Adjustment Method for Spacecraft Mass Property Measurement," 2018 5th International Conference on Systems and Informatics (ICSAI), IEEE, published 2019-01-03. *
李清安 (Li Qing'an), "Research on attitude measurement technology for aerial targets and its simulation experiments" (空中目标姿态测量技术及其仿真实验研究), China Doctoral Dissertations Full-text Database, Engineering Science and Technology II, No. 04, 2007-10-15. *

Also Published As

Publication number Publication date
CN110866954A (en) 2020-03-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant