CN115546318B - Automatic high-speed trajectory calibration method

Automatic high-speed trajectory calibration method

Info

Publication number: CN115546318B
Application number: CN202211473467.9A
Authority: CN (China)
Other versions: CN115546318A (application publication, in Chinese)
Inventors: 牛威, 郝磊, 刘帅, 鱼群
Current and original assignee: Zhongke Xingtu Measurement And Control Technology Co., Ltd.
Legal status: Active (granted)

Application filed by Zhongke Xingtu Measurement And Control Technology Co., Ltd.; priority to CN202211473467.9A; publication of application CN115546318A, followed by grant and publication of CN115546318B.

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/60Rotation of a whole image or part thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2016Rotation, translation, scaling
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Abstract

An automatic high-speed trajectory calibration method comprises the following steps: S1, fixing the sensors on a load platform, performing binocular camera resolving, reconstructing three-dimensional point coordinates in the binocular space coordinate system, and establishing the camera geometric model; S2, fixing the laser sight to the load platform, and collecting three-dimensional point coordinates of the laser ray at a plurality of different depths; S3, fitting the spatial line equation of the laser ray based on the RANSAC algorithm; S4, collecting the three-dimensional point coordinates of the image aiming point and the laser aiming point; S5, calculating the rotation angles of the horizontal and vertical axes of the load platform from the three-dimensional coordinates; and S6, transmitting the rotation angles to the load platform to complete automatic tracking and aiming of the launching device. The method reduces the skill demands on launch personnel, locks the target point accurately and quickly even in harsh scenes, and enables tasks to be completed efficiently and rapidly.

Description

Automatic high-speed trajectory calibration method
Technical Field
The invention relates to the technical field of robotics, and in particular to an automatic high-speed trajectory calibration method.
Background
In daily life, beyond the scenery visible directly in front of us, we often need to observe distant objects, but naked-eye observation has clear limits: faraway things are hard to make out. For simple observation, a telescope can magnify the object and bring distant scenes closer. Launching tasks, however, require precise aiming at a target, so auxiliary tools such as a sighting telescope are needed. A sighting telescope is generally an auxiliary aiming device mounted on launching equipment for military defence; both optical and night-vision sighting telescopes have become indispensable aiming devices in the military. In military training and other preparatory tasks, all kinds of launching equipment require aiming devices, which are critical to accurate aiming and efficient execution of military missions. How to aim at and lock a target point accurately and efficiently in military missions is therefore an urgent problem to be solved.
Manually aiming at a distant target through the sighting telescope is the most common and convenient traditional aiming mode; a launcher can quickly finish a launching task in a specific scene this way. However, shooters differ in their mastery of aiming skills and in experience, and some units do not necessarily hold the corresponding professional certifications. As launching devices are continually updated, ever more precise and complex equipment places higher demands on professional personnel, and daily aiming tasks face the following difficulties:
(1) Manual aiming places high demands on the physical condition of the launcher; in harsh task environments it is difficult to hold the aiming posture steady for long, which degrades aiming precision;
(2) When the target point changes, the launcher must re-aim and re-estimate the target point, so the task cannot be finished efficiently and quickly.
Disclosure of Invention
To address the demands that aiming places on launch personnel and on the environment described in the background art, and to quickly estimate the trajectory in space, the invention provides an automatic high-speed trajectory calibration method comprising the following specific steps:
S1, fixing the sensors on a load platform, performing binocular camera resolving, reconstructing three-dimensional point coordinates in the binocular space coordinate system, and establishing the camera geometric model;
S2, fixing the laser sight to the load platform, and collecting three-dimensional point coordinates of the laser ray at a plurality of different depths;
S3, fitting the spatial line equation of the laser ray based on the RANSAC algorithm;
S4, collecting the three-dimensional point coordinates of the image aiming point and the laser aiming point;
S5, calculating the rotation angles of the horizontal and vertical axes of the load platform from the three-dimensional coordinates;
and S6, transmitting the rotation angles to the load platform to complete automatic tracking and aiming of the launching device.
Specifically, step S1 is as follows:
S11, obtaining the geometric model of the monocular camera through conversions of coordinate systems, and calculating the camera intrinsic parameters;
S12, obtaining the binocular camera geometric model through the camera intrinsic parameters; the binocular geometric model also needs to take the extrinsic parameters into account, namely the rotation-translation relationship between the left and right cameras.
Specifically, step S11 is as follows:
S111, transforming the world coordinate system (X_w, Y_w, Z_w) into the camera coordinate system (X_c, Y_c, Z_c) by a rigid-body transformation; the world coordinate system is the reference coordinate system for the position of a real object in three-dimensional space, the camera coordinate system is used as the space coordinate system, and the optical centre of the left camera is taken as the coordinate origin of the world coordinates; the camera coordinates take the optical centre of the camera itself as the coordinate origin, with the Z axis parallel to the camera optical axis; the conversion is specifically:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = R \begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} + T$$

where R is the rotation matrix and T is the translation matrix;
S112, projecting the camera coordinate system (X_c, Y_c, Z_c) into the image coordinate system (x, y) by perspective projection; the image coordinate system is established with reference to the two-dimensional image captured by the camera and indicates the position of the measured object in the image, and it comprises continuous image coordinates and/or spatial image coordinates; the origin of the image coordinate system lies at the intersection of the camera optical axis with the imaging plane, with units of mm; the perspective projection relation that carries the object from the camera coordinate system into the image coordinate system is calculated as:

$$x = \frac{f X_c}{Z_c}, \qquad y = \frac{f Y_c}{Z_c}$$

where f (the focal length) is a camera intrinsic parameter;
S113, discretizing the image coordinate system (x, y) into the pixel coordinate system (u, v); the pixel coordinate system is a discrete image coordinate system whose origin is at the upper-left corner of the image, with units of pixels; the conversion is as follows:

$$u = \frac{x}{dx} + u_0, \qquad v = \frac{y}{dy} + v_0$$

where dx represents the actual width of a unit pixel on the camera CCD in the x-axis direction and dy represents the actual width of a unit pixel on the camera CCD in the y-axis direction; the pixel coordinates correspond to the discretization of the x axis and the y axis; written in matrix form with homogeneous coordinates, the above formula becomes:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$

and the inverse relationship can be written as:

$$\begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} dx & 0 & -u_0\,dx \\ 0 & dy & -v_0\,dy \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}$$

where dx, dy, u_0, v_0 are the camera intrinsic parameters.
Specifically, the extrinsic parameters are calculated as follows:
Assume the world coordinate system coincides with the left camera coordinate system; let O_l-X_l Y_l Z_l denote the left camera coordinate system, o_l-x_l y_l its image coordinate system, and f_l the left camera focal length; let O_r-X_r Y_r Z_r denote the right camera coordinate system, o_r-x_r y_r its image coordinate system, and f_r the right camera focal length; P is any point in space, and the projections of the point P on the left and right cameras are p_l = (x_l, y_l) and p_r = (x_r, y_r) respectively. The following relations are obtained from the camera perspective transformation model:

$$x_l = \frac{f_l X}{Z}, \qquad y_l = \frac{f_l Y}{Z} \tag{1}$$

$$x_r = \frac{f_r X_r}{Z_r}, \qquad y_r = \frac{f_r Y_r}{Z_r} \tag{2}$$

Assume the rotation matrix is

$$R = \begin{bmatrix} r_1 & r_2 & r_3 \\ r_4 & r_5 & r_6 \\ r_7 & r_8 & r_9 \end{bmatrix}$$

and the translation vector is T = (t_x, t_y, t_z)^T; then the following transformation relationship holds:

$$\begin{bmatrix} X_r \\ Y_r \\ Z_r \end{bmatrix} = R \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} + T \tag{3}$$

From (1), (2) and (3), the correspondence between the pixel points of the left and right cameras is as follows:

$$\frac{Z_r}{f_r} \begin{bmatrix} x_r \\ y_r \\ f_r \end{bmatrix} = \frac{Z}{f_l}\, R \begin{bmatrix} x_l \\ y_l \\ f_l \end{bmatrix} + T$$

Thus the three-dimensional space point (X, Y, Z) in the world coordinate system can be expressed as the following formula (4), realizing the world-coordinate calculation of any point in the common field of view of the cameras:

$$X = \frac{x_l Z}{f_l}, \quad Y = \frac{y_l Z}{f_l}, \quad Z = \frac{f_l\,(f_r t_x - x_r t_z)}{x_r\,(r_7 x_l + r_8 y_l + r_9 f_l) - f_r\,(r_1 x_l + r_2 y_l + r_3 f_l)} \tag{4}$$
Specifically, step S3 comprises:
S31, randomly selecting the minimum set of sample point coordinates from which the spatial line model can be estimated, and initializing the count of points within the error range, namely the inlier count, with the optimal inlier count N, the model threshold t and the optimal iteration count K; for line fitting, the minimum data set is 2 point coordinates;
S32, calculating a spatial line model from the minimum data set;
S33, substituting all data into the spatial line model, counting the inliers and their coordinates by calculating the distance from each point to the line, and taking these coordinates as the optimal inlier coordinate set;
S34, judging whether the current iteration count ck is greater than or equal to the optimal iteration count K; if so, going directly to step S36, and if not, going to step S35;
S35, judging whether the current inlier count CN is greater than the optimal inlier count N set at initialization; if so, assigning CN to N as the new optimal inlier count and returning to step S31; if not, going to step S32;
and S36, substituting the optimal inlier coordinate set into a least-squares fitting algorithm and solving the optimal spatial line equation.
Specifically, step S4 comprises: measuring within the image vision system and collecting the image aiming point coordinate P_view; then, based on the optimal spatial line model of step S3, substituting the depth value z of the current target point to obtain the laser aiming point P_laser at that depth; at this moment, the two three-dimensional points P_view and P_laser in the world coordinate system under the binocular system have been obtained, and the horizontal and vertical angles between the two points in space are calculated through the conversion relation of the space coordinates.
Specifically, step S5 comprises:
rotating the laser aiming point to the target aiming point in two steps, the angle being decomposed into a horizontal component δH and a vertical component δV; the aiming points P_view and P_laser and the distance between them can be obtained through binocular resolving, the distance being set as the depth z;
the rotation angles of the load platform can then be calculated in two steps:

$$\delta H = \arctan\frac{x_{view} - x_{laser}}{z}, \qquad \delta V = \arctan\frac{y_{view} - y_{laser}}{z}$$
the invention has the beneficial effects that: the method and the device can reduce the requirement on transmitting personnel, can accurately and quickly lock the target point position in a severe scene, and realize efficient and quick completion of tasks.
Drawings
Fig. 1 and fig. 2 are structural diagrams of an automatic high-speed trajectory calibration method according to the present invention.
Fig. 3 is a connection diagram of unit modules of an automatic high-speed trajectory calibration method according to the present invention.
Fig. 4 is a schematic diagram of laser sight calibration point acquisition.
Fig. 5 is a diagram of the calibration point acquisition and detection process.
Fig. 6 is a visualization of the three-dimensional calibration points acquired.
Fig. 7 is a flowchart of RANSAC spatial line fitting.
Fig. 8 shows the best inliers for fitting the straight line after RANSAC screening.
Fig. 9 is a schematic diagram of the angle calculation between the target point and the aiming point.
Detailed Description
As shown in fig. 1 to fig. 3, an automatic high-speed trajectory calibration method comprises the following steps:
S1, fixing the sensors on a load platform, performing binocular camera resolving, reconstructing three-dimensional point coordinates in the binocular space coordinate system, and establishing the camera geometric model; the specific steps are as follows:
S11, obtaining the geometric model of the monocular camera through conversions of coordinate systems, and calculating the camera intrinsic parameters;
S111, transforming the world coordinate system (X_w, Y_w, Z_w) into the camera coordinate system (X_c, Y_c, Z_c) by a rigid-body transformation; the world coordinate system is the reference coordinate system for the position of a real object in three-dimensional space, the camera coordinate system is used as the space coordinate system, and, for convenience of calculation, the optical centre of the left camera is taken as the coordinate origin of the world coordinates; the camera coordinates take the optical centre of the camera itself as the coordinate origin, with the Z axis parallel to the camera optical axis; the conversion is specifically:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = R \begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} + T$$

where R is the rotation matrix and T is the translation matrix.
S112, projecting the camera coordinate system (X_c, Y_c, Z_c) into the image coordinate system (x, y) by perspective projection. The image coordinate system is established with reference to the two-dimensional image captured by the camera and indicates the position of the measured object in the image, and it comprises continuous image coordinates and/or spatial image coordinates. The origin of the image coordinate system lies at the intersection of the camera optical axis with the imaging plane, with units of mm. The perspective projection relation that carries the object from the camera coordinate system into the image coordinate system is calculated as:

$$x = \frac{f X_c}{Z_c}, \qquad y = \frac{f Y_c}{Z_c}$$

where f (the focal length) is a camera intrinsic parameter.
S113, discretizing the image coordinate system (x, y) into the pixel coordinate system (u, v). The pixel coordinate system is a discrete image coordinate system whose origin is at the upper-left corner of the image, with units of pixels. The conversion is as follows:

$$u = \frac{x}{dx} + u_0, \qquad v = \frac{y}{dy} + v_0$$

where dx represents the actual width of a unit pixel on the camera CCD in the x-axis direction and dy represents the actual width of a unit pixel on the camera CCD in the y-axis direction (the size of a single CCD or CMOS sensor element). The pixel coordinates correspond to the discretization of the x axis and the y axis. Written in matrix form with homogeneous coordinates, the above formula becomes:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$

and the inverse relationship can be written as:

$$\begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} dx & 0 & -u_0\,dx \\ 0 & dy & -v_0\,dy \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}$$

where dx, dy, u_0, v_0 are the camera intrinsic parameters.
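Taken together, steps S111 to S113 compose the standard pinhole projection chain from a world point to a pixel. The following numpy sketch is an illustration only; the pose and the concrete intrinsic values f, dx, dy, u0, v0 are hypothetical, chosen purely to exercise the three conversions:

```python
import numpy as np

def world_to_pixel(P_w, R, T, f, dx, dy, u0, v0):
    """Project a 3-D world point to pixel coordinates via the
    world -> camera -> image -> pixel chain of steps S111-S113."""
    P_c = R @ P_w + T                 # S111: rigid-body transformation
    x = f * P_c[0] / P_c[2]           # S112: perspective projection (mm)
    y = f * P_c[1] / P_c[2]
    u = x / dx + u0                   # S113: discretization to pixels
    v = y / dy + v0
    return u, v

# A point on the optical axis must land on the principal point (u0, v0).
u, v = world_to_pixel(np.array([0.0, 0.0, 2.0]), np.eye(3), np.zeros(3),
                      f=8.0, dx=0.005, dy=0.005, u0=640.0, v0=512.0)
print(u, v)  # 640.0 512.0
```

The inverse of S113 recovers (x, y) from (u, v) with the inverse matrix given above.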
S12, obtaining the binocular camera geometric model through the camera intrinsic parameters; the binocular geometric model also needs to consider the extrinsic parameters, namely the rotation-translation relationship between the left and right cameras, which are calculated as follows:
Assume the world coordinate system coincides with the left camera coordinate system; let O_l-X_l Y_l Z_l denote the left camera coordinate system, o_l-x_l y_l its image coordinate system, and f_l the left camera focal length; let O_r-X_r Y_r Z_r denote the right camera coordinate system, o_r-x_r y_r its image coordinate system, and f_r the right camera focal length; P is any point in space, and the projections of the point P on the left and right cameras are p_l = (x_l, y_l) and p_r = (x_r, y_r) respectively. The binocular camera geometric model is shown in fig. 3.
The following relations are obtained from the camera perspective transformation model:

$$x_l = \frac{f_l X}{Z}, \qquad y_l = \frac{f_l Y}{Z} \tag{1}$$

$$x_r = \frac{f_r X_r}{Z_r}, \qquad y_r = \frac{f_r Y_r}{Z_r} \tag{2}$$

Assume the rotation matrix is

$$R = \begin{bmatrix} r_1 & r_2 & r_3 \\ r_4 & r_5 & r_6 \\ r_7 & r_8 & r_9 \end{bmatrix}$$

and the translation vector is T = (t_x, t_y, t_z)^T; then the following transformation relationship holds:

$$\begin{bmatrix} X_r \\ Y_r \\ Z_r \end{bmatrix} = R \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} + T \tag{3}$$

From (1), (2) and (3), the correspondence between the pixel points of the left and right cameras is as follows:

$$\frac{Z_r}{f_r} \begin{bmatrix} x_r \\ y_r \\ f_r \end{bmatrix} = \frac{Z}{f_l}\, R \begin{bmatrix} x_l \\ y_l \\ f_l \end{bmatrix} + T$$

Thus the three-dimensional space point (X, Y, Z) in the world coordinate system can be expressed as the following formula (4), realizing the world-coordinate calculation of any point in the common field of view of the cameras:

$$X = \frac{x_l Z}{f_l}, \quad Y = \frac{y_l Z}{f_l}, \quad Z = \frac{f_l\,(f_r t_x - x_r t_z)}{x_r\,(r_7 x_l + r_8 y_l + r_9 f_l) - f_r\,(r_1 x_l + r_2 y_l + r_3 f_l)} \tag{4}$$
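The world-coordinate calculation of formula (4) admits a direct numerical sketch, given here under the same assumption that the world frame coincides with the left camera frame; the focal lengths, baseline and matched image coordinates are hypothetical. For parallel cameras the result must agree with the familiar check Z = f * b / disparity:

```python
import numpy as np

def triangulate(xl, yl, xr, fl, fr, R, T):
    """Recover (X, Y, Z) in the left-camera (= world) frame from matched
    left/right image coordinates, following formula (4)."""
    r1, r2, r3 = R[0]
    r7, r8, r9 = R[2]
    tx, tz = T[0], T[2]
    a_top = r1 * xl + r2 * yl + r3 * fl   # first row of R applied to (xl, yl, fl)
    a_bot = r7 * xl + r8 * yl + r9 * fl   # third row of R applied to (xl, yl, fl)
    Z = fl * (fr * tx - xr * tz) / (xr * a_bot - fr * a_top)
    return np.array([xl * Z / fl, yl * Z / fl, Z])

# Parallel cameras, baseline 120 mm: expect Z = fl * b / (xl - xr).
R, T = np.eye(3), np.array([-120.0, 0.0, 0.0])
P = triangulate(xl=2.0, yl=0.0, xr=1.4, fl=8.0, fr=8.0, R=R, T=T)
print(P)  # approximately [400, 0, 1600]
```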
S2, fixing the laser sight to the load platform, and collecting three-dimensional point coordinates of the laser ray at a plurality of different depths; the specific steps are as follows:
As shown in fig. 4, the laser sight is fixed to the load platform and the platform angle is adjusted so that the emitted laser beam falls within reach of the calibration operator; the laser spot is then imaged on a receiving whiteboard (or white cardboard), and the three-dimensional point coordinates of the laser spots P1, P2, ..., Pn at different distances are acquired with the binocular camera. In this example, the laser point data acquisition and detection process is shown in fig. 5 (each pair of left and right images yields one set of spatial points, and the black positions are outliers); the laser sight calibration points obtained are visualized in fig. 6, and the series of points acquired in three-dimensional space lie essentially on a single spatial line.
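The patent does not specify how the spot centre is extracted from each camera image before binocular reconstruction; a minimal assumed detector is an intensity-weighted centroid over a brightness threshold, sketched below on a synthetic image (the threshold value and image size are illustrative assumptions):

```python
import numpy as np

def spot_centroid(gray, thresh=200):
    """Return the (u, v) intensity centroid of pixels at or above
    `thresh`, or None when no pixel qualifies (no spot visible)."""
    ys, xs = np.nonzero(gray >= thresh)
    if len(xs) == 0:
        return None
    w = gray[ys, xs].astype(float)
    return float((xs * w).sum() / w.sum()), float((ys * w).sum() / w.sum())

img = np.zeros((480, 640), dtype=np.uint8)
img[99:102, 199:202] = 255        # synthetic 3x3 spot centred at (200, 100)
print(spot_centroid(img))         # (200.0, 100.0)
```

Running the detector on the left and right images gives the matched pixel pair that the binocular model of step S1 converts into one three-dimensional calibration point.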
S3, fitting the spatial line equation of the laser ray based on the RANSAC algorithm; the specific steps are as follows:
The RANSAC algorithm assumes that some samples in the data set carry large noise, and divides the whole sample data set into correct data (inliers, sample data that the model can describe) and abnormal data (outliers, sample data far from the normal range that cannot fit the mathematical model), where the outliers may arise from erroneous measurements, erroneous assumptions, erroneous calculations and the like. Based on the RANSAC algorithm, the spatial line equation of the laser ray is fitted by repeatedly selecting data subsets and estimating a model, iterating until a model judged good enough is estimated. The implementation, shown in fig. 7, is divided into the following steps:
S31, randomly selecting the minimum set of sample point coordinates from which the spatial line model can be estimated, and initializing the count of points within the error range, namely the inlier count, with the optimal inlier count N, the model threshold t and the optimal iteration count K; for line fitting, the minimum data set is 2 point coordinates;
S32, calculating a spatial line model from the minimum data set;
S33, substituting all data into the spatial line model, counting the inliers and their coordinates by calculating the distance from each point to the line, and taking these coordinates as the optimal inlier coordinate set;
S34, judging whether the current iteration count ck is greater than or equal to the optimal iteration count K; if so, going directly to step S36, and if not, going to step S35;
S35, judging whether the current inlier count CN is greater than the optimal inlier count N set at initialization; if so, assigning CN to N as the new optimal inlier count and returning to step S31; if not, going to step S32;
and S36, substituting the optimal inlier coordinate set into a least-squares fitting algorithm and solving the optimal spatial line equation.
Fig. 8 shows the three-dimensional visualization of the inliers screened out of the collected spatial points by the RANSAC algorithm.
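Steps S31 to S36 can be sketched as follows. This is a compact illustration rather than the patent's implementation: the threshold t, iteration count K and the synthetic data are assumptions, and the final least-squares refit (S36) is realised here as the SVD principal direction of the best inlier set:

```python
import numpy as np

def ransac_line(points, t=1.0, K=200, seed=0):
    """RANSAC fit of a 3-D line: repeatedly sample 2 points (the minimal
    set, S31), build a candidate line (S32), count inliers by
    point-to-line distance (S33), keep the best set over K iterations
    (S34/S35), then least-squares refit on the best inliers (S36)."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(K):
        i, j = rng.choice(len(points), size=2, replace=False)
        d = points[j] - points[i]
        norm = np.linalg.norm(d)
        if norm == 0.0:
            continue
        d /= norm
        v = points - points[i]
        dist = np.linalg.norm(v - np.outer(v @ d, d), axis=1)
        inliers = dist < t
        if inliers.sum() > best.sum():
            best = inliers
    inl = points[best]
    p0 = inl.mean(axis=0)                  # a point on the fitted line
    _, _, vt = np.linalg.svd(inl - p0)
    return p0, vt[0], best                 # point, unit direction, inlier mask

zs = np.linspace(0.0, 100.0, 40)
pts = np.stack([0.5 * zs, -0.2 * zs, zs], axis=1)
pts += np.random.default_rng(1).normal(0.0, 0.05, pts.shape)
pts[::10] += 20.0                          # inject 4 gross outliers
p0, d, mask = ransac_line(pts, t=0.5)
print(mask.sum())                          # 36 of 40 points kept
```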
S4, collecting the three-dimensional point coordinates of the image aiming point and the laser aiming point. Specifically, measurement is carried out within the image vision system and the image aiming point coordinate P_view is collected. Then, based on the optimal spatial line model of step S3, the depth value z of the current target point is substituted to obtain the laser aiming point P_laser at that depth. At this moment, the two three-dimensional points P_view and P_laser in the world coordinate system under the binocular system have been obtained, and the horizontal and vertical angles between the two points in space are calculated through the conversion relation of the space coordinates.
S5, calculating rotation angles of the horizontal axis and the vertical axis of the load platform through three-dimensional coordinates;
setting a laser aiming point P view Laser aiming point P laser In the image of fig. 9, the laser aiming point is now rotated to the target aiming point (the real position to be aimed) in two steps, the angle is divided into a transverse direction δ H and a direction δ V perpendicular to the transverse direction δ H, and the aiming point is located at P view And P laser The distance therebetween can be obtained by binocular solution (set as the depth z), and the schematic calculation thereof is shown in fig. 9.
The rotation angles of the load platform can be calculated in two steps:

$$\delta H = \arctan\frac{x_{view} - x_{laser}}{z}, \qquad \delta V = \arctan\frac{y_{view} - y_{laser}}{z}$$
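A small sketch of the two-step angle solution described above; reading the geometry of fig. 9 as "arctangent of lateral offset over the working depth z" is an assumption of this illustration, as are the coordinate values:

```python
import math

def platform_angles(p_view, p_laser, z):
    """Rotation angles (radians) carrying the laser aiming point onto the
    image aiming point: horizontal deltaH about the vertical axis and
    vertical deltaV about the horizontal axis."""
    dH = math.atan2(p_view[0] - p_laser[0], z)
    dV = math.atan2(p_view[1] - p_laser[1], z)
    return dH, dV

# 100 mm lateral and 50 mm vertical offset at a depth of 2000 mm.
dH, dV = platform_angles((100.0, 50.0, 2000.0), (0.0, 0.0, 2000.0), z=2000.0)
print(round(math.degrees(dH), 3), round(math.degrees(dV), 3))  # 2.862 1.432
```

The two angles are exactly the quantities transmitted to the load platform in step S6.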
And S6, transmitting the rotation angles to the load platform to complete automatic tracking and aiming of the launching device.
The above description covers only preferred embodiments of the present invention, but the scope of protection of the invention is not limited thereto. Any change or substitution that a person skilled in the art could readily conceive within the technical scope disclosed herein, according to the technical solutions and inventive concept of the present invention, shall fall within the scope of protection of the invention.

Claims (7)

1. An automatic high-speed trajectory calibration method, characterized by comprising the following steps:
S1, fixing the sensors on a load platform, performing binocular camera resolving, reconstructing three-dimensional point coordinates in the binocular space coordinate system, and establishing the camera geometric model;
S2, fixing the laser sight to the load platform, and collecting three-dimensional point coordinates of the laser ray at a plurality of different depths;
S3, fitting the spatial line equation of the laser ray based on the RANSAC algorithm;
S4, collecting the three-dimensional point coordinates of the image aiming point and the laser aiming point;
S5, calculating the rotation angles of the horizontal and vertical axes of the load platform from the three-dimensional coordinates;
and S6, transmitting the rotation angles to the load platform to complete automatic tracking and aiming of the launching device.
2. The automatic high-speed trajectory calibration method according to claim 1, wherein step S1 is specifically as follows:
S11, obtaining the geometric model of the monocular camera through conversions of coordinate systems, and calculating the camera intrinsic parameters;
S12, obtaining the binocular camera geometric model through the camera intrinsic parameters; the binocular geometric model also needs to take the extrinsic parameters into account, namely the rotation-translation relationship between the left and right cameras.
3. The automatic high-speed trajectory calibration method according to claim 2, wherein step S11 is specifically as follows:
S111, transforming the world coordinate system (X_w, Y_w, Z_w) into the camera coordinate system (X_c, Y_c, Z_c) by a rigid-body transformation; the world coordinate system is the reference coordinate system for the position of a real object in three-dimensional space, the camera coordinate system is used as the space coordinate system, and the optical centre of the left camera is taken as the coordinate origin of the world coordinates; the camera coordinates take the optical centre of the camera itself as the coordinate origin, with the Z axis parallel to the camera optical axis; the conversion is specifically:

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = R \begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} + T$$

where R is the rotation matrix and T is the translation matrix;
S112, projecting the camera coordinate system (X_c, Y_c, Z_c) into the image coordinate system (x, y) by perspective projection; the image coordinate system is established with reference to the two-dimensional image captured by the camera and indicates the position of the measured object in the image, and it comprises continuous image coordinates and/or spatial image coordinates; the origin of the image coordinate system lies at the intersection of the camera optical axis with the imaging plane, with units of mm; the perspective projection relation that carries the object from the camera coordinate system into the image coordinate system is calculated as:

$$x = \frac{f X_c}{Z_c}, \qquad y = \frac{f Y_c}{Z_c}$$

where f (the focal length) is a camera intrinsic parameter;
S113, discretizing the image coordinate system (x, y) into the pixel coordinate system (u, v); the pixel coordinate system is a discrete image coordinate system whose origin is at the upper-left corner of the image, with units of pixels; the conversion is as follows:

$$u = \frac{x}{dx} + u_0, \qquad v = \frac{y}{dy} + v_0$$

where dx represents the actual width of a unit pixel on the camera CCD in the x-axis direction and dy represents the actual width of a unit pixel on the camera CCD in the y-axis direction; the pixel coordinates correspond to the discretization of the x axis and the y axis; written in matrix form with homogeneous coordinates, the above formula becomes:

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$

and the inverse relationship can be written as:

$$\begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} dx & 0 & -u_0\,dx \\ 0 & dy & -v_0\,dy \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}$$

where dx, dy, u_0, v_0 are the camera intrinsic parameters.
4. The automatic high-speed trajectory calibration method according to claim 1, characterized in that the extrinsic parameter calculation method is as follows:

assuming that the world coordinate system coincides with the left camera coordinate system, $O_l\text{-}X_lY_lZ_l$ represents the left camera coordinate system, $O_l\text{-}x_ly_l$ represents its image coordinate system, and $f_l$ represents the left camera focal length; $O_r\text{-}X_rY_rZ_r$ represents the right camera coordinate system, $O_r\text{-}x_ry_r$ represents its image coordinate system, and $f_r$ represents the right camera focal length; P is any point in space, and the projection points of P on the left and right cameras are $p_l(x_l,y_l)$ and $p_r(x_r,y_r)$ respectively; the following relations can be obtained from the camera perspective transformation model:

$$s_l\begin{bmatrix}x_l\\ y_l\\ 1\end{bmatrix}=\begin{bmatrix}f_l&0&0\\ 0&f_l&0\\ 0&0&1\end{bmatrix}\begin{bmatrix}X_l\\ Y_l\\ Z_l\end{bmatrix}\tag{1}$$

$$s_r\begin{bmatrix}x_r\\ y_r\\ 1\end{bmatrix}=\begin{bmatrix}f_r&0&0\\ 0&f_r&0\\ 0&0&1\end{bmatrix}\begin{bmatrix}X_r\\ Y_r\\ Z_r\end{bmatrix}\tag{2}$$

assuming the rotation matrix from the left to the right camera coordinate system is

$$R=\begin{bmatrix}r_1&r_2&r_3\\ r_4&r_5&r_6\\ r_7&r_8&r_9\end{bmatrix}$$

and the translation vector is $T=[t_x,t_y,t_z]^T$, then the following transformation relationship holds:

$$\begin{bmatrix}X_r\\ Y_r\\ Z_r\end{bmatrix}=R\begin{bmatrix}X_l\\ Y_l\\ Z_l\end{bmatrix}+T\tag{3}$$

as can be seen from (1), (2) and (3), the correspondence between the pixel points of the left and right cameras is:

$$s_r\begin{bmatrix}x_r\\ y_r\\ 1\end{bmatrix}=\begin{bmatrix}f_r&0&0\\ 0&f_r&0\\ 0&0&1\end{bmatrix}\left(R\begin{bmatrix}X_l\\ Y_l\\ Z_l\end{bmatrix}+T\right)$$

thus, a three-dimensional spatial point $(X,Y,Z)$ in the world coordinate system can be expressed as the following formula (4), realizing the world coordinate calculation of any point in the cameras' common field of view:

$$X=\frac{x_lZ}{f_l},\quad Y=\frac{y_lZ}{f_l},\quad Z=\frac{f_l\,(f_r t_x-x_r t_z)}{x_r(r_7x_l+r_8y_l+r_9f_l)-f_r(r_1x_l+r_2y_l+r_3f_l)}\tag{4}$$
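The binocular triangulation of claim 4 can be sketched in code as follows. The function implements the standard two-view reconstruction formula (4) for a world frame coincident with the left camera; the test geometry below (identity rotation, a pure x-translation, 50 mm focal lengths) is illustrative, not the patent's calibrated extrinsics.

```python
import numpy as np

def triangulate(xl, yl, xr, fl, fr, R, T):
    """Recover the world (= left camera) coordinates (X, Y, Z) of a point
    from its left image coordinates (xl, yl) and right x-coordinate xr,
    given focal lengths fl, fr and the left-to-right extrinsics R, T."""
    r1, r2, r3 = R[0]
    r7, r8, r9 = R[2]
    tx, tz = T[0], T[2]
    # Depth from the ratio of the right camera's x-projection equations.
    Z = fl * (fr * tx - xr * tz) / (
        xr * (r7 * xl + r8 * yl + r9 * fl) - fr * (r1 * xl + r2 * yl + r3 * fl))
    # Back-project through the left camera.
    return xl * Z / fl, yl * Z / fl, Z
```

With R = I and T = (-0.3, 0, 0), a point at (0.1, 0.2, 5.0) projects to (1.0, 2.0) in the left image and x = -2.0 in the right image, and the formula recovers the original point, which checks term-by-term consistency of (1)-(4).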
5. The automatic high-speed trajectory calibration method according to claim 1, wherein step S3 specifically comprises:
S31, randomly selecting the minimum set of sample point coordinates required to estimate the spatial straight-line model, and initializing the number of points falling within the error range, i.e. the number of inliers, with the optimal inlier number N, the model threshold t and the optimal iteration number K; for straight-line fitting, the minimum data set is 2 point coordinates;
S32, calculating a spatial straight-line model from the minimum data set;
S33, substituting all data into the spatial straight-line model, counting the number of inliers and their coordinates by calculating the distance from each point to the straight line, and taking these coordinates as the optimal inlier coordinate set;
S34, judging whether the current iteration number CK is greater than or equal to the optimal iteration number K; if so, proceeding directly to step S36, and if not, proceeding to step S35;
S35, judging whether the current inlier number CN is greater than the optimal inlier number N set at initialization; if so, assigning CN to N as the optimal inlier number and returning to step S31, and if not, proceeding to step S32;
and S36, substituting the optimal inlier coordinate set into a least-squares fitting algorithm and solving the optimal spatial straight-line equation.
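Steps S31-S36 describe a RANSAC loop for fitting a 3D line. A minimal sketch, simplified relative to the claim (a fixed number of iterations K rather than the CK/CN bookkeeping, and a PCA-based least-squares refit standing in for S36), could look like:

```python
import numpy as np

def ransac_line_3d(points, t=0.05, K=100, rng=None):
    """RANSAC fit of a 3D line, returned as (point p0, unit direction d).

    points : (n, 3) array of 3D coordinates
    t      : inlier distance threshold (model threshold of S31)
    K      : number of iterations (optimal iteration number of S34)
    """
    rng = np.random.default_rng(rng)
    best_inliers, N = None, 0
    for _ in range(K):                                   # S34: iterate up to K times
        a, b = points[rng.choice(len(points), 2, replace=False)]  # S31: minimal set of 2 points
        d = (b - a) / np.linalg.norm(b - a)              # S32: candidate line model
        # S33: point-to-line distance |(p - a) x d| for every point
        dist = np.linalg.norm(np.cross(points - a, d), axis=1)
        inliers = points[dist < t]
        if len(inliers) > N:                             # S35: keep the best inlier set
            N, best_inliers = len(inliers), inliers
    # S36: least-squares refit on the best inliers (mean point + principal direction)
    p0 = best_inliers.mean(axis=0)
    _, _, vt = np.linalg.svd(best_inliers - p0)
    return p0, vt[0]
```

On data consisting of collinear points plus a few random outliers, the recovered direction matches the true line direction up to sign, which is why the fit is returned as a point and an (unsigned) direction vector.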
6. The automatic high-speed trajectory calibration method according to claim 1, wherein step S4 specifically comprises: measuring in the image vision system and collecting the target aiming point coordinates $P_{view}$; then, based on the optimal spatial straight-line model of step S3, substituting the depth value z of the current target point to obtain the laser aiming point $P_{laser}$ at the depth of the current target aiming point $P_{view}$; at this moment, the three-dimensional points $P_{view}$ and $P_{laser}$ of the world coordinate system under the binocular system are obtained, and the horizontal and vertical angles between the two points in space are calculated through the conversion relation of the spatial coordinates.
7. The automatic high-speed trajectory calibration method according to claim 6, wherein step S5 specifically comprises:
rotating the laser aiming point to the target aiming point in two steps, the angle being divided into a transverse component ΔH and a vertical component ΔV; for the aiming points $P_{view}$ and $P_{laser}$, the distance between the two points can be obtained through binocular resolving, and the depth is set as z;
the rotation angles of the loading platform can then be calculated in two steps:

$$\Delta H=\arctan\frac{x_{view}-x_{laser}}{z},\qquad \Delta V=\arctan\frac{y_{view}-y_{laser}}{z}$$
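The rotation formula of claim 7 survives in the source only as an embedded image. Under the natural reading (the transverse and vertical corrections follow from the offsets between P_view and P_laser at depth z via the arctangent, which is an assumption, not confirmed by the text), a sketch would be:

```python
import math

def platform_rotation(p_view, p_laser, z):
    """Two-step platform correction: transverse angle dH and vertical angle dV.

    The arctan form is an assumed reconstruction; the patent's original
    formula is an embedded image that is not reproduced in the text.
    """
    dH = math.atan2(p_view[0] - p_laser[0], z)  # transverse (horizontal) angle
    dV = math.atan2(p_view[1] - p_laser[1], z)  # vertical angle
    return dH, dV
```

For example, a 1-unit horizontal offset at depth 5 gives dH = arctan(0.2) and dV = 0, i.e. the platform is rotated horizontally first and then vertically.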
CN202211473467.9A 2022-11-23 2022-11-23 Automatic high-speed trajectory calibration method Active CN115546318B (en)

Publications (2)

Publication Number Publication Date
CN115546318A (en) 2022-12-30
CN115546318B (en) 2023-04-07



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 35th Floor, Building A1, Phase I, Zhongan Chuanggu Science and Technology Park, No. 900, Wangjiang West Road, High-tech Zone, Hefei City, Anhui Province, 230000

Applicant after: Zhongke Xingtu Measurement and Control Technology Co.,Ltd.

Address before: 35th Floor, Building A1, Phase I, Zhongan Chuanggu Science and Technology Park, No. 900, Wangjiang West Road, High-tech Zone, Hefei City, Anhui Province, 230000

Applicant before: Zhongke Xingtu measurement and control technology (Hefei) Co.,Ltd.

GR01 Patent grant