CN107449419B - Full-parametric vision measurement method for continuous motion parameters of body target - Google Patents


Info

Publication number
CN107449419B
Authority
CN
China
Prior art keywords
coordinate system
equation
parameters
motion
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710597416.XA
Other languages
Chinese (zh)
Other versions
CN107449419A (en)
Inventor
尚洋
裴宇
于起峰
张红良
徐志强
Current Assignee
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date
Filing date
Publication date
Application filed by National University of Defense Technology filed Critical National University of Defense Technology
Priority to CN201710597416.XA priority Critical patent/CN107449419B/en
Publication of CN107449419A publication Critical patent/CN107449419A/en
Application granted granted Critical
Publication of CN107449419B publication Critical patent/CN107449419B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a fully parametric vision measurement method for the continuous motion parameters of a body target. The motion parameters of the moving body target are fully parameterized, an overall solution equation is established, the fully parameterized motion parameters and the unknown structural parameters are substituted into the equation, and the quantities to be solved are finally obtained. The method solves with high precision and meets the accuracy requirements of most situations; even in the presence of observation errors, the designed algorithm remains accurate and stable. The invention can be applied to observation tasks in which a moving monocular platform tracks a moving body target in range optical measurement, and has important theoretical research significance and broad application prospects.

Description

Full-parametric vision measurement method for continuous motion parameters of body target
Technical Field
The invention is mainly used for measuring the motion position and attitude of a moving body target from a moving monocular platform. Under the limitation that only the azimuth information of the moving body target can be observed through a monocular camera and distance information cannot be obtained, the designed algorithm solves for the structure and motion parameters of the observed target. The invention belongs to the field of camera-based measurement.
Background
In practical measurement tasks, a maneuvering platform often needs to perform three-dimensional observation of a moving body target with a single camera. For example, in applications such as robot confrontation games and robot football, a robot must measure, in real time and while itself moving, the three-dimensional motion states of moving targets such as an opponent robot or the ball, including three-dimensional trajectories, velocities, accelerations, attitude angles and angular velocities, covering both the distance information along the camera axis and the attitude information of the target itself. In long-endurance flight tests, the motion parameters of a target must be observed from a moving platform. In applications such as space robots and underwater robots, there is likewise a need for a robot to observe moving targets while in motion. In weapon test ranges, optical measurement is widely applied to the measurement of target exterior ballistics.
At present, range optical measurement is mainly performed by ground-based measuring stations. However, with the development of novel weapons, particularly cruise missiles, characteristics such as low flight altitude and long range make it difficult for conventional ground optical stations to cover the whole trajectory. A single camera for recording the live situation is already installed on range measurement-and-control aircraft; if the motion information of a cruise missile can be measured from the target images shot by this single camera, large-range target tracking and measurement can be achieved by optical means without adding ground stations, greatly improving range test capability while saving cost.
In addition, the measurement of non-cooperative targets is a common problem in practical applications. The defining characteristic of a non-cooperative target is that its geometric size and structural features are unknown, so the constraint relations among control points available in ordinary observation cannot be used to help solve for the motion parameters. If a maneuvering platform can effectively analyse the structure and motion of such a target monocularly under these restricted conditions, it can play an important role in military operations and has great application prospects. The invention provides a design algorithm for solving the structural parameters and the position and attitude information of a moving (non-cooperative) body target under the monocular observation limit of a moving platform.
Disclosure of Invention
The technical problem to be solved by the invention is the observation of a moving body target by a moving monocular platform: obtaining the structural parameters, motion position and attitude parameters of the observed body target through a designed algorithm, under the condition that only the azimuth of the moving body target in space relative to the moving monocular platform can be obtained, while distance information cannot. The input data of the algorithm are the azimuth angles of each feature point on the body target relative to the moving monocular camera at each shooting moment, together with the position information of the moving monocular camera. The solved parameters comprise the structural parameters, motion position and attitude parameters of the body target.
The technical scheme of the invention is to fully parameterize the motion parameters of the moving body target, establish an overall solution equation, substitute the fully parameterized motion parameters and the unknown structural parameters into the equation, and finally obtain the quantities to be solved. The scheme comprises three main steps: first, the position and attitude information of the moving body target is fully parameterized and expressed by time-parameter polynomials; then an overall solution equation is established from the spatial geometric relations; finally, the parameterized motion position and attitude parameters and the unknown structural parameters are substituted into the overall equation and solved to obtain the quantities to be measured.
The full-parametric vision measurement method of the continuous motion parameters of the body target comprises the following steps:
The spatial moving target to be measured is in general geometrically invariant. The objects studied here are therefore geometric invariants, i.e. rigid bodies in the mechanical sense: the relative positional relationship between the particles of the target remains constant at all times during the motion. For any body target, under the premise of unchanged geometry, the position and attitude of the whole target are determined once the positions of at least three feature points on it are known. The basic idea of the method is described below with three feature points on the body target as the object of study.
In the solution, this property of the body target as a geometric invariant is used to establish and solve the equations. First a feature point is selected as the base point describing the motion position of the body target; the motion is then regarded as the change of this base point's position plus the change of the attitude angles around it. As shown in Fig. 1, m is an artificially chosen base point; in a short time it moves from m_1 to m_2, during which the target position changes by Δr and the pitch angle by Δϑ (because of the planar representation only the pitch change can be drawn; in real space there are corresponding yaw and roll changes Δψ and Δγ).
After the motion of the body target is decomposed into translation and rotation, six parameters describe its motion: the XYZ coordinates of the base point (in the world coordinate system) and the three attitude angles of the body target, namely the pitch, yaw and roll angles. These six parameters are expressed as time-parameter polynomials, so that the quantities to be solved become the coefficients of the six polynomials. By establishing an overall equation relating the motion of the body target to the motion of the camera, and using the known camera motion parameters and the camera's lines of sight to each feature point, the equations are solved for the unknown polynomial coefficients. Polynomials describing all motion parameters of the body target are thus obtained, and from them all of its motion information.
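As a concrete illustration (a minimal sketch with made-up coefficients and polynomial orders; the function name `motion_state` is not from the patent), the six-polynomial parameterization can be written as:

```python
import numpy as np

def motion_state(t, coeffs):
    """Evaluate the six fully parameterized motion parameters at time t.
    coeffs maps each parameter name to its polynomial coefficients
    (lowest order first); the coefficients are the unknowns to be solved."""
    return {name: float(np.polynomial.polynomial.polyval(t, c))
            for name, c in coeffs.items()}

# Made-up example coefficients: quadratic X, linear Y, constant Z, etc.
coeffs = {
    "X": [0.0, 2.0, 0.5],    # X(t) = 2 t + 0.5 t^2
    "Y": [1.0, -1.0],        # Y(t) = 1 - t
    "Z": [5.0],              # Z(t) = 5
    "pitch": [0.1, 0.02],
    "yaw":   [0.0, 0.01],
    "roll":  [0.0],
}
state = motion_state(2.0, coeffs)
print(state["X"], state["Y"])  # 6.0 -1.0
```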
The following describes the algorithm by taking the case of studying 3 feature points as an example:
3.1 description of the problem
3.1.1 definition of feature points with respect to motion monocular azimuth
The orientation of a feature point in space relative to the camera optical center (i.e. the orientation of the viewing line in space) can be described by a pitch angle and an azimuth angle in a spherical coordinate system (local coordinate system) with the camera optical center as origin. Assume that at time t_i the point target P_i has pitch angle α_i and azimuth angle β_i relative to the camera optical center, as shown in Fig. 2, where P_i′ is the projection of P_i onto the XY plane of the local coordinate system; the pitch angle α_i is the angle between C_i P_i′ and C_i P_i, and the azimuth angle β_i is the angle between C_i P_i′ and the local Y axis.
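Under the definition above, the two sight-line angles can be computed from the optical center C and target point P as follows (a sketch; the function name and sign conventions are assumptions consistent with Fig. 2):

```python
import numpy as np

def line_of_sight_angles(C, P):
    """Pitch and azimuth of point P relative to optical center C:
    alpha is the angle between C-P' and C-P (P' = projection of P onto
    the local XY plane); beta is the angle between C-P' and the Y axis."""
    d = np.asarray(P, float) - np.asarray(C, float)
    horiz = np.hypot(d[0], d[1])        # length of the XY-plane projection
    alpha = np.arctan2(d[2], horiz)     # pitch angle
    beta = np.arctan2(d[0], d[1])       # azimuth, measured from the Y axis
    return alpha, beta

# A point at equal X/Y offset and matching height: both angles are 45 deg.
a, b = line_of_sight_angles((0, 0, 0), (1, 1, np.sqrt(2)))
print(np.degrees(a), np.degrees(b))
```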
3.1.2 input data and quantity to be solved
The known quantity is a motion track expression of a camera along with time
X_C = f_XC(t),   Y_C = f_YC(t),   Z_C = f_ZC(t)    (3-1)
And any photographing time tiAzimuth angle of each feature point in space with respect to the camera
(α_ji, β_ji),   j = 1, 2, 3    (3-2)
The quantity to be solved is the structural information of the volume target, namely the spatial distance information between each characteristic point on the volume target, and the spatial motion position and attitude information of the volume target.
3.2 Algorithm flow
The algorithm flow chart is shown in fig. 3.
3.2.1 coordinate System definition and transformation relationships
For convenience of calculation, the body-target local coordinate system is defined as follows. As shown in Fig. 4, feature point 1 is the origin, the ray from point 1 to point 2 is the X-axis direction, the Y axis lies in the plane determined by points 1, 2 and 3, and the Z axis is determined by the right-handed rectangular coordinate system.
For the coordinate transformation relation, the rotation sequence of the three attitude angles from the launch coordinate system to the projectile coordinate system in flight mechanics is pitch, yaw, roll; accordingly, the rotation sequence of the three Euler angles from the world coordinate system to the local coordinate system is assumed to be rotation about the Y axis, rotation about the Z axis and rotation about the X axis. Assume the coordinates of a point Q on the body target in the local coordinate system form the column vector
Q_local = (x, y, z)^T    (3-3)
For convenience of expression, the three Euler angles from the world coordinate system to the local coordinate system are called the pitch angle ϑ, the yaw angle ψ and the roll angle γ. The coordinate column vector of the point Q in the world coordinate system is then
Q_world = M^T · Q_local + P_1    (3-4)
where P_1 is the coordinate of the local-coordinate-system origin (i.e. feature point 1) in the world coordinate system, M is the direction cosine matrix, and M^T denotes the transpose of M; for a direction cosine matrix the transpose equals the inverse. Specifically,
M_ϑ = | cosϑ   0  −sinϑ |
      |  0     1    0   |
      | sinϑ   0   cosϑ |    (3-5)

M_ψ = |  cosψ  sinψ  0 |
      | −sinψ  cosψ  0 |
      |   0     0    1 |    (3-6)

M_γ = | 1    0     0  |
      | 0  cosγ  sinγ |
      | 0 −sinγ  cosγ |,   M = M_γ · M_ψ · M_ϑ    (3-7)
The above is the coordinate transformation from the local coordinate system to the world coordinate system. When the world coordinates of a point on the body target need to be converted into local coordinates, it follows from equation (3-4) that
Q_local = M · (Q_world − P_1)    (3-8)
Therefore, whenever the calculation requires, the position of any point can be expressed either by its world-coordinate-system coordinates or by its local-coordinate-system coordinates, which facilitates solving for the body-target motion parameters.
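The transform pair (3-4)/(3-8) can be sketched as follows, assuming standard elementary rotation matrices for the Y-Z-X sequence stated above (the patent's exact sign conventions may differ; all function names are illustrative):

```python
import numpy as np

def Ry(a):  # rotation about the Y axis (pitch)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]])

def Rz(a):  # rotation about the Z axis (yaw)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def Rx(a):  # rotation about the X axis (roll)
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]])

def direction_cosine(pitch, yaw, roll):
    """Direction cosine matrix M (world -> local), composed Y, then Z, then X."""
    return Rx(roll) @ Rz(yaw) @ Ry(pitch)

def local_to_world(q_local, M, P1):
    return M.T @ q_local + P1        # eq. (3-4)

def world_to_local(q_world, M, P1):
    return M @ (q_world - P1)        # eq. (3-8)

# Round trip: local -> world -> local must recover the original point,
# since the transpose of a direction cosine matrix equals its inverse.
M = direction_cosine(0.1, 0.2, 0.3)
P1 = np.array([4.0, 5.0, 6.0])
q = np.array([1.0, 2.0, 3.0])
round_trip = world_to_local(local_to_world(q, M, P1), M, P1)
print(np.allclose(round_trip, q))  # True
```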
3.2.2 expansion of coordinate transformation expressions
Since the specific transformation expressions of the three XYZ coordinate axes are needed when building the overall solution equation, the coordinate transformation expressions are expanded here.
Assume that the distances between points 1-2, 2-3, and 3-1 are
(d_12, d_23, d_31)    (3-9)
According to the definition of the local coordinate system, the coordinates of the feature points 2 and 3 in the local coordinate system are
P_2,local = (d_12, 0, 0)^T,   P_3,local = (d_31·cosθ, d_31·sinθ, 0)^T    (3-10)
where θ is the angle between line 1-2 and line 1-3; from the geometry of the plane triangle (law of cosines),
cosθ = (d_12² + d_31² − d_23²) / (2·d_12·d_31)    (3-11)
The relation converting the local coordinates of feature points 2 and 3 into world coordinates is
P_j,world = M^T · P_j,local + P_1,   j = 2, 3    (3-12)
Expanding the above formula gives
X_P2 = X_P1 + d_12·m′_11,   Y_P2 = Y_P1 + d_12·m′_21,   Z_P2 = Z_P1 + d_12·m′_31    (3-13)

X_P3 = X_P1 + d_31·(cosθ·m′_11 + sinθ·m′_12),   Y_P3 = Y_P1 + d_31·(cosθ·m′_21 + sinθ·m′_22),   Z_P3 = Z_P1 + d_31·(cosθ·m′_31 + sinθ·m′_32)    (3-14)

where m′_kl denotes the (k, l) element of M^T.
In these formulas X_Pj, Y_Pj, Z_Pj denote the X, Y, Z coordinate values of feature point j (j = 1, 2, 3) in the world coordinate system.
The above equation is a coordinate transformation relation equation needed for establishing the overall solution equation.
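A small sketch of eqs. (3-10)-(3-11): recovering the local-frame coordinates of feature points 2 and 3 from the three inter-point distances (the function name is illustrative):

```python
import numpy as np

def local_feature_coords(d12, d23, d31):
    """Local coordinates of feature points 2 and 3 from side lengths.
    Point 1 is the origin, point 2 lies on the X axis, point 3 in the
    XY plane, per the local coordinate system definition."""
    cos_t = (d12**2 + d31**2 - d23**2) / (2.0 * d12 * d31)  # law of cosines
    sin_t = np.sqrt(1.0 - cos_t**2)                          # theta in (0, pi)
    P2 = np.array([d12, 0.0, 0.0])
    P3 = np.array([d31 * cos_t, d31 * sin_t, 0.0])
    return P2, P3

# A 3-4-5 right triangle: the recovered P3 must sit at distance d23 from P2.
P2, P3 = local_feature_coords(3.0, 4.0, 5.0)
print(P2, P3, np.linalg.norm(P3 - P2))
```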
3.2.3 establishing the overall solution equation
Before establishing the overall solution equation, the 6 motion parameters of the body target are all expressed by time-parameter polynomials. Assume the three parameters describing position information and the three describing attitude information are respectively
X_P1(t) = a_0 + a_1·t + … + a_n·t^n,   Y_P1(t) = b_0 + b_1·t + … + b_n·t^n,   Z_P1(t) = c_0 + c_1·t + … + c_n·t^n    (3-15)
ϑ(t) = e_0 + e_1·t + … + e_n·t^n,   ψ(t) = f_0 + f_1·t + … + f_n·t^n,   γ(t) = g_0 + g_1·t + … + g_n·t^n    (3-16)
For any feature point j in the 3 feature points, the equation in the sight-track intersection method is
tanβ_ji = (X_Pj − X_C) / (Y_Pj − Y_C),   tanα_ji = (Z_Pj − Z_C) / √((X_Pj − X_C)² + (Y_Pj − Y_C)²)    (3-17)
Writing the equations of the 3 feature points simultaneously gives the 6 equations
{ tanβ_ji = (X_Pj − X_C) / (Y_Pj − Y_C)
  tanα_ji = (Z_Pj − Z_C) / √((X_Pj − X_C)² + (Y_Pj − Y_C)²) },   j = 1, 2, 3    (3-18)
The coordinates of feature point 1 are expressed by time-parameter polynomials, and the coordinates of feature points 2 and 3 are expressed through the coordinates of feature point 1 and the polynomially expressed attitude angles together with the structural parameters. The result is a nonlinear system of equations in the 6 sets of polynomial coefficients and the 3 unknown structural parameters.
For each shooting moment t_i, an equation set corresponding to the target state at that moment can be obtained:
{ tanβ_ji = (X_Pj(t_i) − f_XC(t_i)) / (Y_Pj(t_i) − f_YC(t_i))
  tanα_ji = (Z_Pj(t_i) − f_ZC(t_i)) / √((X_Pj(t_i) − f_XC(t_i))² + (Y_Pj(t_i) − f_YC(t_i))²) },   j = 1, 2, 3    (3-19)
Writing out this equation set at each shooting moment in sequence yields the full equation system of the overall solution algorithm. For data obtained by actual observation, each quantity to be solved can then be obtained by numerical solution, determining all the structure and motion parameters of the body target.
In the actual solving process, the fully parameterized overall equation is rather complex. Observing the structure of the equation set, the coordinates of feature point 1 can first be solved independently (in essence, feature point 1 is treated as a point target and solved separately within the overall framework) without affecting the subsequent solution of the attitude-angle and structural parameters. Therefore the numerical computation proceeds in two steps: first solve the position parameters with feature point 1 as base point, then solve the body-target structural parameters and motion attitude parameters.
3.2.4 solution of volumetric target position parameters
As explained above, the motion information of the body target is described with feature point 1 as base point; solving the spatial motion trajectory of feature point 1 expressed as time polynomials therefore yields the motion position parameters of the body target. For the position parameters, consider first the first two equations of system (3-19), i.e.
tanβ_1i = (X_P1(t_i) − f_XC(t_i)) / (Y_P1(t_i) − f_YC(t_i)),   tanα_1i = (Z_P1(t_i) − f_ZC(t_i)) / √((X_P1(t_i) − f_XC(t_i))² + (Y_P1(t_i) − f_YC(t_i))²)    (3-20)
Substituting the parameterized body-target position-parameter expressions into the equations and solving numerically yields the spatial motion trajectory of feature point 1 expressed as time polynomials, with the specific expression
X_P1 = f_X1(t),   Y_P1 = f_Y1(t),   Z_P1 = f_Z1(t)    (3-21)
Up to this point, the body-target motion position parameters with feature point 1 as base point have been obtained.
3.2.5 solution of structural and attitude parameters
After the position parameters have been solved from the first two equations of the overall system, the spatial motion trajectory of feature point 1 expressed as time polynomials is known. Of the overall solution equation
{ tanβ_ji = (X_Pj(t_i) − f_XC(t_i)) / (Y_Pj(t_i) − f_YC(t_i))
  tanα_ji = (Z_Pj(t_i) − f_ZC(t_i)) / √((X_Pj(t_i) − f_XC(t_i))² + (Y_Pj(t_i) − f_YC(t_i))²) },   j = 1, 2, 3    (3-22)
only the last four equations need to be considered, i.e.
{ tanβ_ji = (X_Pj(t_i) − f_XC(t_i)) / (Y_Pj(t_i) − f_YC(t_i))
  tanα_ji = (Z_Pj(t_i) − f_ZC(t_i)) / √((X_Pj(t_i) − f_XC(t_i))² + (Y_Pj(t_i) − f_YC(t_i))²) },   j = 2, 3    (3-23)
In the formula, the coordinate expressions of the characteristic point 2 and the characteristic point 3 are
X_P2 = X_P1 + d_12·m′_11,   Y_P2 = Y_P1 + d_12·m′_21,   Z_P2 = Z_P1 + d_12·m′_31    (3-24)

X_P3 = X_P1 + d_31·(cosθ·m′_11 + sinθ·m′_12),   Y_P3 = Y_P1 + d_31·(cosθ·m′_21 + sinθ·m′_22),   Z_P3 = Z_P1 + d_31·(cosθ·m′_31 + sinθ·m′_32)    (3-25)
At this point the coordinate expression of feature point 1 is known, so the quantities to be solved are the distances between points 1-2, 2-3 and 3-1,
(d_12, d_23, d_31)    (3-26)
and the coefficients of the time-parameter expressions of the attitude angles,
(e_0, …, e_n,  f_0, …, f_n,  g_0, …, g_n)    (3-27)
Substituting the attitude-angle expressions (3-16) into equations (3-23) and solving numerically yields the value of each quantity to be solved: the distances between the feature points and the time-parameter expressions of the three attitude angles, i.e. the structural parameters and motion attitude parameters of the body target. Together with the previously solved motion position parameters, this gives all the structure, motion position and attitude information sought for the studied body target.
3.3 discussion of known cases of volumetric target Structure
The algorithm above is aimed at non-cooperative targets, i.e. motion-parameter solution when the body-target structure is unknown. In actual observation, the case where the structure of the body target is known is simply an easier special case: the structural parameters in the overall solution equation change from unknowns to knowns, so when solving the equations the structural parameters need not be solved; their known values are substituted directly into the equations and the attitude parameters are solved.
The invention can achieve the following technical effects:
1. the method is suitable for optical measurement of a flying target in target range measurement;
2. only a moving monocular measuring platform is required, so the equipment is simple;
3. only the azimuth information of the target needs to be measured; no optical-path modification or auxiliary range-finding equipment is required;
4. the precision of the calculated motion and structure parameters is high, and the requirement of an observation task under a general condition is met;
5. the method has wide application range and wide application prospect, and can be widely applied to maneuvering observation tasks of the movement monocular platform;
6. the method only needs mature hardware (a camera) and is low in cost (only shooting data are required), giving it strong engineering practicability.
Drawings
FIG. 1 is a schematic diagram of the basic idea of the algorithm;
FIG. 2 is a schematic view of the definition of the observation azimuth;
FIG. 3 is a schematic flow chart of the algorithm;
FIG. 4 is a schematic diagram of a local coordinate system definition.
The specific implementation is as follows:
the invention provides a practical process of a full-parametric vision measurement method of a body target continuous motion parameter, which comprises the following steps:
1) designing a motion track of a motion monocular platform, and continuously shooting pictures of an observation object within a certain time by using the motion monocular platform under the designed motion track;
2) the azimuth of each feature point on the body target relative to the moving monocular optical center at each shooting moment is computed from the obtained pictures;
3) with the motion trajectory parameters of the moving monocular platform and the obtained azimuth information of each feature point of the body target as input data, the structural parameters and motion position and attitude parameters of the body target are solved using the fully parametric vision measurement method for continuous motion parameters, obtaining the required information.

Claims (2)

1. The fully-parameterized visual measurement method for the continuous motion parameters of the body target is characterized by comprising the following three steps of:
step 1, firstly, fully parameterizing the position and attitude information of a moving body target, and expressing the position and attitude information by using a time parameter polynomial, wherein the specific contents are as follows:
the motion of the body target is decomposed into translation and rotation, and six parameters are used to describe it, namely the XYZ coordinates of a base point and the three attitude angles of the body target, i.e. the pitch angle ϑ, the yaw angle ψ and the roll angle γ; these are expressed by time-parameter polynomials, so the quantities to be solved become the coefficients of the six polynomials, specifically as follows:
3.2.1 coordinate System definition and transformation relationships
For convenience of calculation, the local coordinate system of the body target is defined as follows: feature point 1 is the origin, the ray from point 1 to point 2 is the X-axis direction, the Y axis lies in the plane determined by points 1, 2 and 3, and the Z axis is determined by the right-handed rectangular coordinate system,
for the coordinate transformation relation, the rotation sequence of the three attitude angles from the launch coordinate system to the projectile coordinate system in flight mechanics is pitch, yaw, roll; the rotation sequence of the three Euler angles from the world coordinate system to the local coordinate system is assumed to be rotation about the Y axis, rotation about the Z axis and rotation about the X axis; assume the coordinates of a point Q on the body target in the local coordinate system form the column vector
Q_local = (x, y, z)^T    (3-3)
For convenience of expression, the three Euler angles from the world coordinate system to the local coordinate system are called the pitch angle ϑ, the yaw angle ψ and the roll angle γ; the coordinate column vector of the point Q in the world coordinate system is then
Q_world = M^T · Q_local + P_1    (3-4)
where P_1 is the coordinate of the local-coordinate-system origin, i.e. feature point 1, in the world coordinate system, M is the direction cosine matrix, and M^T denotes the transpose of M; for a direction cosine matrix the transpose equals the inverse; specifically,
M_ϑ = | cosϑ   0  −sinϑ |
      |  0     1    0   |
      | sinϑ   0   cosϑ |    (3-5)

M_ψ = |  cosψ  sinψ  0 |
      | −sinψ  cosψ  0 |
      |   0     0    1 |    (3-6)

M_γ = | 1    0     0  |
      | 0  cosγ  sinγ |
      | 0 −sinγ  cosγ |,   M = M_γ · M_ψ · M_ϑ    (3-7)
The above is the coordinate transformation from the local to the world coordinate system; when the world coordinates of a point on the body target need to be converted into local coordinates, it follows from equation (3-4) that
Q_local = M · (Q_world − P_1)    (3-8)
Therefore, whenever the calculation requires, the position of any point can be expressed either by its world-coordinate or its local-coordinate representation, facilitating the solution of the body-target motion parameters,
3.2.2 expansion of coordinate transformation expressions
Since the specific transformation expressions of the three XYZ coordinate axes are needed when establishing the overall solution equation, the coordinate transformation expressions are expanded here,
assume that the distances between points 1-2, 2-3, and 3-1 are
(d_12, d_23, d_31)    (3-9)
According to the definition of the local coordinate system, the coordinates of the feature points 2 and 3 in the local coordinate system are
P_2,local = (d_12, 0, 0)^T,   P_3,local = (d_31·cosθ, d_31·sinθ, 0)^T    (3-10)
where θ is the angle between line 1-2 and line 1-3; from the geometry of the plane triangle,
cosθ = (d_12² + d_31² − d_23²) / (2·d_12·d_31)    (3-11)
The relation converting the local coordinates of feature points 2 and 3 into world coordinates is
P_j,world = M^T · P_j,local + P_1,   j = 2, 3    (3-12)
Expanding the above formula gives
X_P2 = X_P1 + d_12·m′_11,   Y_P2 = Y_P1 + d_12·m′_21,   Z_P2 = Z_P1 + d_12·m′_31    (3-13)

X_P3 = X_P1 + d_31·(cosθ·m′_11 + sinθ·m′_12),   Y_P3 = Y_P1 + d_31·(cosθ·m′_21 + sinθ·m′_22),   Z_P3 = Z_P1 + d_31·(cosθ·m′_31 + sinθ·m′_32)    (3-14)

where m′_kl denotes the (k, l) element of M^T,
in which X_Pj, Y_Pj, Z_Pj denote the X, Y, Z coordinate values of feature point j (j = 1, 2, 3) in the world coordinate system,
the above formulas are the coordinate transformation relations needed for establishing the overall solution equation;
step 2, establishing an integral solving equation according to the space geometric relationship through analysis;
the establishing of the integral equation specifically comprises the following steps:
before establishing an overall solution equation, firstly, 6 motion parameters of a description object are all expressed by a time parameter polynomial, and three parameters for describing position information and three parameters for describing attitude information are respectively assumed to be
X_P1(t) = a_0 + a_1·t + … + a_n·t^n,   Y_P1(t) = b_0 + b_1·t + … + b_n·t^n,   Z_P1(t) = c_0 + c_1·t + … + c_n·t^n    (3-15)

ϑ(t) = e_0 + e_1·t + … + e_n·t^n,   ψ(t) = f_0 + f_1·t + … + f_n·t^n,   γ(t) = g_0 + g_1·t + … + g_n·t^n    (3-16)
For any feature point j in the 3 feature points, the equation in the sight-track intersection method is
tanβ_ji = (X_Pj − X_C) / (Y_Pj − Y_C),   tanα_ji = (Z_Pj − Z_C) / √((X_Pj − X_C)² + (Y_Pj − Y_C)²)    (3-17)
writing the equations of the 3 feature points simultaneously gives the 6 equations
{ tanβ_ji = (X_Pj − X_C) / (Y_Pj − Y_C)
  tanα_ji = (Z_Pj − Z_C) / √((X_Pj − X_C)² + (Y_Pj − Y_C)²) },   j = 1, 2, 3    (3-18)
wherein the coordinates of feature point 1 are expressed by time-parameter polynomials, and the coordinates of feature points 2 and 3 by the coordinates of feature point 1, the polynomially expressed attitude angles and the structural parameters, yielding a nonlinear equation system in the 6 sets of polynomial coefficients and the 3 unknown structural parameters; in the equations X_Pj, Y_Pj, Z_Pj denote the X, Y, Z coordinate values of feature point j (j = 1, 2, 3) in the world coordinate system; α_ji denotes the pitch angle and β_ji the azimuth angle of feature point j (j = 1, 2, 3) in space relative to the camera at any shooting moment t_i; and X_C = f_XC(t), Y_C = f_YC(t), Z_C = f_ZC(t) represent the motion trajectory of the camera over time;
for each shooting instant tiObtaining an equation set corresponding to the target state of the time body
{ tanβ_ji = (X_Pj(t_i) − f_XC(t_i)) / (Y_Pj(t_i) − f_YC(t_i))
  tanα_ji = (Z_Pj(t_i) − f_ZC(t_i)) / √((X_Pj(t_i) − f_XC(t_i))² + (Y_Pj(t_i) − f_YC(t_i))²) },   j = 1, 2, 3    (3-19)
writing this equation set at every shooting moment gives the full equation system of the overall solution algorithm; for data obtained by actual observation, each quantity to be solved is obtained by numerical solution, thereby determining all the structure and motion parameters of the body target,
in the actual solving process, since the fully parameterized overall equation is complex, observation of the equation set shows that the coordinates of feature point 1 can first be solved independently without affecting the subsequent solution of the attitude-angle and structural parameters; the numerical computation is therefore divided into two steps: first solving the position parameters with feature point 1 as base point, then solving the body-target structural parameters and motion attitude parameters;
and step 3, finally, substituting the parameterized motion position and attitude parameters of the body target and the unknown structural parameters into the overall equation system and solving it to obtain the quantities to be measured.
2. The full-parametric vision measurement method for continuous motion parameters of a body target according to claim 1, wherein the quantities to be measured are solved as follows:
4.1) solution of volumetric target position parameters
In the algorithm, feature point 1 serves as the base point describing the motion information of the body target, so the motion position parameters of the body target are in fact obtained by solving for the spatial motion trajectory of feature point 1 expressed as time polynomials; for the solution of the position parameters, only the first two terms of equation system (3-19) are needed, namely
Figure FDA0002303758090000051
substituting the parameterized expression of the body target position parameters into the equations and solving numerically yields the spatial motion trajectory of feature point 1 expressed as time polynomials, i.e. the specific expression
Figure FDA0002303758090000052
thus the motion position parameters of the body target, with feature point 1 as the base point, are obtained;
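The position-parameter step above can be sketched as a nonlinear least-squares fit of the point-1 polynomial coefficients to the angle observations. Everything concrete here is an assumption for illustration: the camera track, the quadratic polynomial degree, the angle conventions, and the initialization are all hypothetical, not the patent's (3-19) as given in the equation images.

```python
import numpy as np
from scipy.optimize import least_squares

def cam_pos(t):
    # Hypothetical known camera track X_C = f_XC(t), Y_C = f_YC(t), Z_C = f_ZC(t).
    return np.array([0.5 * t, 0.0, 1.0])

def point1_pos(coeffs, t):
    # Quadratic time polynomials for the point-1 coordinates (degree is an assumption).
    cx, cy, cz = np.reshape(coeffs, (3, 3))
    basis = np.array([1.0, t, t * t])
    return np.array([cx @ basis, cy @ basis, cz @ basis])

def residuals(coeffs, times, pitches, azimuths):
    # Difference between predicted and observed bearing angles at each shooting time.
    r = []
    for t, a, b in zip(times, pitches, azimuths):
        d = point1_pos(coeffs, t) - cam_pos(t)
        r.append(np.arctan2(d[2], np.hypot(d[0], d[1])) - a)  # pitch residual
        r.append(np.arctan2(d[1], d[0]) - b)                  # azimuth residual
    return r

# Synthetic observations generated from a known ground-truth trajectory.
truth = np.array([2.0, 0.3, 0.0, 1.0, -0.2, 0.05, 0.5, 0.1, 0.0])
times = np.linspace(0.0, 2.0, 10)
rel = [point1_pos(truth, t) - cam_pos(t) for t in times]
pitches = [np.arctan2(d[2], np.hypot(d[0], d[1])) for d in rel]
azimuths = [np.arctan2(d[1], d[0]) for d in rel]

# Start near the truth; a cold start would need a coarser initialization scheme.
sol = least_squares(residuals, x0=truth + 0.1, args=(times, pitches, azimuths))
```

The recovered `sol.x` are the polynomial coefficients of the point-1 trajectory; the camera must move between shooting times for the bearings to triangulate the trajectory at all.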
4.2) solving of structural and attitude parameters
After the position parameters are obtained by solving the first two terms of the overall equation system, the spatial motion trajectory of feature point 1 expressed as time polynomials is known; for the overall equation system
Figure FDA0002303758090000053
Only the last four terms need be considered, i.e.
Figure FDA0002303758090000061
where the coordinate expressions of feature point 2 and feature point 3 are
Figure FDA0002303758090000062
Figure FDA0002303758090000063
At this point, the coordinate expression of feature point 1 is known, so the quantities to be solved are the distances between points 1-2, 2-3, and 3-1
(d12, d23, d31) (3-26)
and the coefficients of the time-parameter expressions of the attitude angles
Figure FDA0002303758090000064
Substituting the expression (3-16) of each attitude angle into equation (3-23) and solving numerically gives the numerical solution of each quantity to be solved, yielding the distances between the feature points and the time-parameter expressions of the three attitude angles, which determine the structural parameters and motion-attitude parameters of the body target; together with the previously solved motion position parameters, this provides all the structural information and motion position and attitude information required for the body target under study.
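A hedged sketch of this final step: placing feature points 2 and 3 from the solved point-1 position, the attitude angles, and the three distances. The Euler-angle order and the body-frame layout of the triangle are hypothetical choices, since the patent's exact expressions (3-24) and (3-25) are given only as equation images.

```python
import numpy as np

def rotation_zyx(yaw, pitch, roll):
    # Body-to-world rotation; the Z-Y-X Euler order is an assumption.
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

def place_points_2_3(p1, angles, d12, d23, d31):
    """World coordinates of points 2 and 3 from point 1, the attitude,
    and the rigid triangle's side lengths (the structural parameters).

    The body-frame layout (point 2 on the body x-axis, point 3 in the
    body x-y plane) is hypothetical, not the patent's (3-24)/(3-25).
    """
    R = rotation_zyx(*angles)
    x3 = (d12**2 + d31**2 - d23**2) / (2.0 * d12)  # law of cosines
    y3 = np.sqrt(max(d31**2 - x3**2, 0.0))
    p2 = np.asarray(p1, dtype=float) + R @ np.array([d12, 0.0, 0.0])
    p3 = np.asarray(p1, dtype=float) + R @ np.array([x3, y3, 0.0])
    return p2, p3
```

Because the body is rigid, the three distances are constant across all shooting times, which is why d12, d23, d31 can appear in the equation system as time-invariant structural unknowns alongside the time-varying attitude polynomials.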
CN201710597416.XA 2017-07-21 2017-07-21 Full-parametric vision measurement method for continuous motion parameters of body target Active CN107449419B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710597416.XA CN107449419B (en) 2017-07-21 2017-07-21 Full-parametric vision measurement method for continuous motion parameters of body target


Publications (2)

Publication Number Publication Date
CN107449419A CN107449419A (en) 2017-12-08
CN107449419B true CN107449419B (en) 2020-06-26

Family

ID=60488855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710597416.XA Active CN107449419B (en) 2017-07-21 2017-07-21 Full-parametric vision measurement method for continuous motion parameters of body target

Country Status (1)

Country Link
CN (1) CN107449419B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113405532B (en) * 2021-05-31 2022-05-06 中国农业大学 Forward intersection measuring method and system based on structural parameters of vision system
CN114022541B (en) * 2021-09-17 2024-06-04 中国人民解放军63875部队 Method for determining ambiguity correct solution of optical single-station gesture processing

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5647015A (en) * 1991-12-11 1997-07-08 Texas Instruments Incorporated Method of inferring sensor attitude through multi-feature tracking
CN101907459A (en) * 2010-07-12 2010-12-08 清华大学 Monocular video based real-time posture estimation and distance measurement method for three-dimensional rigid body object
CN102435188A (en) * 2011-09-15 2012-05-02 南京航空航天大学 Monocular vision/inertia autonomous navigation method for indoor environment
CN102829785A (en) * 2012-08-30 2012-12-19 中国人民解放军国防科学技术大学 Air vehicle full-parameter navigation method based on sequence image and reference image matching

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7487063B2 (en) * 2003-06-13 2009-02-03 UNIVERSITé LAVAL Three-dimensional modeling from arbitrary three-dimensional curves


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on tracking and localization technology for moving targets based on monocular vision; Yao Nan; China Doctoral Dissertations Full-text Database, Information Science and Technology; 20160115 (No. 1); pp. 2-10 *
Research on vision-based position and attitude measurement methods for space targets; Shang Yang; China Doctoral Dissertations Full-text Database, Engineering Science and Technology II; 20080615 (No. 6); pp. 10-21 *


Similar Documents

Publication Publication Date Title
Sazdovski et al. Inertial navigation aided by vision-based simultaneous localization and mapping
CN107727079A (en) The object localization method of camera is regarded under a kind of full strapdown of Small and micro-satellite
CN107449402A (en) A kind of measuring method of the relative pose of noncooperative target
Taylor et al. Comparison of two image and inertial sensor fusion techniques for navigation in unmapped environments
CN110285800A (en) A kind of the collaboration relative positioning method and system of aircraft cluster
CN107449419B (en) Full-parametric vision measurement method for continuous motion parameters of body target
CN103727937A (en) Star sensor based naval ship attitude determination method
Mejias et al. Omnidirectional bearing-only see-and-avoid for small aerial robots
CN112710303A (en) Method for determining attitude angle theta change of target in field of view caused by motion of motion platform
CN113076634B (en) Multi-machine cooperative passive positioning method, device and system
CN108106597B (en) Method for measuring angle of full strapdown laser seeker under condition of target linear field of view
CN111026139B (en) Three-dimensional model posture adjustment control method based on flight track
Lim et al. Pose estimation using a flash lidar
Corke et al. Sensing and control on the sphere
Cai et al. Multi-source information fusion augmented reality benefited decision-making for unmanned aerial vehicles: A effective way for accurate operation
CN108897029B (en) Non-cooperative target short-distance relative navigation vision measurement system index evaluation method
Qi et al. An Autonomous Pose Estimation Method of MAV Based on Monocular Camera and Visual Markers
CN109059866B (en) Method for measuring installation parameters of planet close-range photogrammetry camera based on image
Wang et al. Estimation of Relative Attitude Information of Cooperative Nodes in UAV Cluster
Yang et al. Inertial-aided vision-based localization and mapping in a riverine environment with reflection measurements
Guo et al. An EPnP Based Extended Kalman Filtering Approach for Docking Pose Estimation of AUVs
Li et al. The Coordinate System Design and Implementation of the Spacecraft's Attitude Simulation based on Five Axis Turntable
Xiaochen et al. Evaluation of Lucas-Kanade based optical flow algorithm
Capozzi et al. Daisy Chain Navigation and Communication in Underground Environments
Wenzhu et al. Head posture recognition method based on POSIT algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant