CN110673122A - Method for measuring target position data, shooting angle and camera view angle by monocular camera - Google Patents
- Publication number: CN110673122A
- Application number: CN201910983336.7A
- Authority
- CN
- China
- Prior art keywords: camera, axis, point, angle, target
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
Abstract
The invention discloses a method for measuring target position data, shooting angle and camera view angle with a monocular camera. Using a single video camera, and given the size of the target object, the method can measure the distance between the camera and the target, and the relative position of any point on the target with respect to the camera (its components on the X, Y and Z axes of a specified three-dimensional coordinate system). It can recover the actual position on the target object of any point in the picture from that point's position in the picture, and conversely find where any point of the target object appears in the picture. It can also measure the shooting angle of the camera (the angles between the shooting direction and the X, Y and Z axes of the specified three-dimensional coordinate system), and the view angle or focal length of the camera.
Description
Technical Field
The invention relates to a method for measuring target position data, shooting angle and camera view angle with a monocular camera, and belongs to the field of computer vision.
Background
The technique closest to the present invention is monocular distance measurement, which can measure the distance between a camera and a target with a single camera when the actual size of the target is known. However, such monocular distance measurement has severe limitations: (1) if we call the line segment on the target whose actual size is measured segment A, and the line between the camera and the target segment B, then segment B must be perpendicular to segment A and must pass through the midpoint of segment A; otherwise the measurement will be in error. (2) The view angle or focal length of the camera must be known in advance. In practice there are two ways to obtain it: the first is to take a picture at a known, pre-measured camera-to-target distance and compute the view angle or focal length from it; the second is to read the camera's intrinsic parameters directly. Both methods produce some error, and in practice the error of the second method is usually larger than that of the first.
Current monocular distance measurement technology also has the following limitations: (1) even when the above conditions are satisfied, the measurement may still have a relatively large error (in practice, roughly 2-5%). (2) In many applications the two requirements cannot be satisfied at all. The first is often too strict: when photographing a tennis court, for example, we can rarely require the shooting direction to be perpendicular to the plane of the court (let alone to pass through its center point). The second requirement also fails in many applications, such as real-time capture of a target with a zoom lens; in some cases we do not even know which camera took the picture.
Monocular ranging has a further, larger limitation: it can only measure the distance between the camera and the target. In many applications, however, the target is not a point (or a line) but a relatively large solid object, and we may need the relative position of any point on the object with respect to the camera, the position in the picture of any point on the object, or the position on the object of any point in the picture; the monocular distance measurement method above cannot meet these needs. Many applications also need the shooting angle of the camera, or the size of its view angle, as important data for further analysis and calculation; here too the monocular distance measurement method is of no use.
Disclosure of Invention
The invention is a new measuring method that shares the two preconditions of monocular distance measurement: the actual size of the target is known, and a single camera is used. However, the present method does not have the limitations of monocular distance measurement listed above; that is: (1) it can measure not only the distance between the camera and the target, but also the relative position of any point on the target with respect to the camera (its components on the X, Y and Z axes of a specified three-dimensional coordinate system); it can recover the actual position on the target object of any point in the picture from that point's position in the picture, and conversely find where any point of the target object appears in the picture; (2) the shooting angle of the camera is arbitrary, and the method can measure that shooting angle (the angles of rotation of the shooting direction around the X, Y and Z axes of the specified coordinate system); (3) the method does not need the view angle or focal length of the camera in advance, and can itself measure the view angle or focal length.
Detailed Description
If the view angle or focal length of the camera is unknown, the method needs the relative positions of 4 points on the target known in advance; no 3 of the 4 points may be collinear, and one of the following must hold: the 4 points are coplanar, or they are not coplanar but the distance h between one point and the plane of the other 3 is known. If the view angle or focal length of the camera is known, the method only needs the relative positions of 3 points on the target, and those 3 points must not lie on one straight line. The steps for the first case are described first, then those for the second.
When the view angle or focal length of the camera is unknown, the steps are as follows. Take the plane through 3 of the 4 points (if the 4 points are coplanar, the 4th point also lies on it), called the A plane, and define a two-dimensional coordinate system C2 on it: one of the 3 points gets coordinates (0, 0), the other two get (x2, 0) and (x3, y3), and the projection of the 4th point onto the A plane gets (x4, y4) (if the 4 points are coplanar, (x4, y4) are the coordinates of the 4th point itself). Among these values, x2 is positive and y3 and y4 are both negative; that is, the 2nd point lies on the positive X axis and the 3rd and 4th points lie below the X axis. All of these values (x2, x3, y3, x4, y4) are known in advance. We then define a three-dimensional coordinate system C3: the shooting direction of the camera is the negative Z direction, the position of the camera is the origin (0, 0, 0), and the intersection of the A plane with the Z axis has coordinates (0, 0, -depth), where depth is unknown. The X and Y directions of C3 are chosen arbitrarily subject to the right-hand rule (for example, the upward direction of the camera may be taken as the positive Y direction). In both C2 and C3, lengths are in real-world units (e.g., meters or millimeters).
Next we translate and rotate the A plane to a position where the coordinates of the 4 points in the C3 coordinate system are (0, 0, -depth), (x2, 0, -depth), (x3, y3, -depth) and (x4, y4, -depth - deltaz4), where deltaz4 = -h when the 4th point is on the same side of the A plane as the camera and deltaz4 = h otherwise; h is known here, and if the 4 points are coplanar then h = 0 and deltaz4 = 0. In this position the A plane is parallel to the X-Y plane of C3, the X axis of C2 points along the X axis of C3, and the Y axis of C2 points along the Y axis of C3. We then extend the coordinates of the 4 points uniformly by one dimension with value 1, obtaining 4 four-dimensional vectors: P1(0, 0, -depth, 1), P2(x2, 0, -depth, 1), P3(x3, y3, -depth, 1), P4(x4, y4, -depth - deltaz4, 1).
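The construction of the 4 homogeneous vectors can be sketched as follows. The coordinate values are made-up placeholders, and depth is fixed only for illustration (in the real method it is one of the unknowns solved for later).

```python
import numpy as np

# Hypothetical planar layout of the 4 reference points in the C2 frame;
# x2 is positive and y3, y4 are negative, as the description requires.
x2, x3, y3, x4, y4 = 1.0, 0.4, -0.8, 1.2, -0.6
h = 0.0          # the 4 points are taken as coplanar here...
deltaz4 = 0.0    # ...so deltaz4 = 0 (it would be -h or +h otherwise)
depth = 5.0      # unknown in the real method; fixed here only for illustration

# Append a 4th homogeneous coordinate equal to 1 to each point.
P1 = np.array([0.0, 0.0, -depth, 1.0])
P2 = np.array([x2, 0.0, -depth, 1.0])
P3 = np.array([x3, y3, -depth, 1.0])
P4 = np.array([x4, y4, -depth - deltaz4, 1.0])
```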
Let the matrices T, U1, Rx, Ry, Rz, U2 and P be as given in matrices 1 to 7 (shown as figures in the original), and set

M=T*U1*Rx*Ry*Rz*U2*P
Let the resolution of the camera be VW × VH (VW and VH are the width and height of the image in pixels, both known), let the matrices Sp, Tp and Op be as given in matrices 8 to 10 (shown as figures in the original), and set

Mr=Sp*Tp*Op
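The matrices Sp, Tp and Op appear only as figures in the original, so the sketch below assumes one conventional reading of Mr — scaling, centring and y-flipping pixel coordinates — purely for illustration; these are not the patent's actual matrices.

```python
import numpy as np

VW, VH = 1920, 1080  # camera resolution in pixels (known)

# Assumed forms (not from the patent): Sp scales pixels by 2/VW, Tp moves the
# origin to the image centre, Op flips the y axis (pixel rows grow downward).
# The description uses the row-vector convention, v_new = v_old @ matrix.
Sp = np.diag([2.0 / VW, 2.0 / VW, 1.0])
Tp = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0],
               [-1.0, -VH / VW, 1.0]])
Op = np.diag([1.0, -1.0, 1.0])
Mr = Sp @ Tp @ Op

# Under these assumptions the centre pixel maps to (0, 0, 1).
centre = np.array([VW / 2.0, VH / 2.0, 1.0]) @ Mr
```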
Then a picture is taken with the camera; no particular shooting angle or distance is required, as long as the 4 points above can be captured together with the target object (in some cases we do not even need to capture all 4 points; see below). Computer image processing is then used to obtain the positions of the 4 points in the photograph, all as pixel values, denoted (px1, py1), (px2, py2), (px3, py3), (px4, py4).
If not all 4 points can be captured but their positions outside the photograph can be computed, that is also acceptable and does not affect the calculation. (For example, if a point is the intersection of two straight lines, its position can be found from the two lines even when the point itself is not in the picture.)
We extend these position vectors uniformly by one dimension with value 1, obtaining 4 three-dimensional vectors: PX1(px1, py1, 1), PX2(px2, py2, 1), PX3(px3, py3, 1), PX4(px4, py4, 1). We then add an unknown bias value factor to py4, yielding the new PX4(px4, py4 + factor, 1).
Then let
P1P=P1*M
P2P=P2*M
P3P=P3*M
P4P=P4*M
PX1P=PX1*Mr
PX2P=PX2*Mr
PX3P=PX3*Mr
PX4P=PX4*Mr
Next, a system of 8 equations (equation set 1, given as a figure in the original) is obtained, where P1P[1] denotes the first component of the vector P1P, and so on for the rest.
The system has 8 unknowns: dx, dy, tx, ty, tz, depth, ta and factor. It is nonlinear, and its solution can be obtained with a numerical solver for nonlinear equation systems.
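Equation set 1 itself is given only as a figure, so the residuals below are a stand-in toy system; the sketch merely demonstrates the named technique — a numerical root-finder for a nonlinear system — as would be applied to the 8 unknowns dx, dy, tx, ty, tz, depth, ta, factor.

```python
import numpy as np

# Toy 2-unknown stand-in for equation set 1 (whose real residuals would
# compare components of Pi*M against those of PXi*Mr for the 4 points).
def residuals(v):
    x, y = v
    return np.array([x ** 2 + y ** 2 - 25.0, x - y - 1.0])

def jacobian(v):
    x, y = v
    return np.array([[2.0 * x, 2.0 * y],
                     [1.0, -1.0]])

# Plain Newton iteration; a library routine such as scipy.optimize.fsolve
# would serve the same purpose for the full 8-unknown system.
v = np.array([1.0, 1.0])
for _ in range(50):
    v = v - np.linalg.solve(jacobian(v), residuals(v))
```

As with any such solver, a reasonable initial guess matters; for the real system, rough prior knowledge of depth and the angles would make a natural starting point.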
Next we can obtain the shooting angle of the camera. Establish a new three-dimensional coordinate system C3' based on the C2 coordinate system above: its origin is at (-dx, -dy) of C2, its X and Y axes coincide with those of C2, and its Z axis follows the right-hand rule. In C3', take the negative Z direction as the initial shooting direction of the camera; the camera lies on the Z axis at coordinates (0, 0, depth), where depth is one of the 8 unknowns found above and is positive. Rotating the shooting direction around the X, Y and Z axes by -tx, -ty and -tz respectively gives the real shooting direction of the camera.
Next is the view angle of the camera: if the view angle is α, then α is related to ta by
ta=1/tan(α/2)
where tan is the tangent function.
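Inverting this relation is direct. The focal-length helper below assumes a given sensor width, which is an illustrative addition, not part of the patent's formula.

```python
import math

def view_angle_from_ta(ta):
    # Invert ta = 1 / tan(alpha / 2) to recover the view angle alpha (radians).
    return 2.0 * math.atan(1.0 / ta)

def focal_length(alpha, sensor_width):
    # Same triangle read the other way: focal length for a given sensor width.
    return (sensor_width / 2.0) / math.tan(alpha / 2.0)

alpha = view_angle_from_ta(1.0)   # ta = 1 corresponds to a 90 degree view angle
f = focal_length(alpha, 36.0)     # 18 mm on a 36 mm wide sensor
```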
After the above 8 parameters and the 10 matrix values they determine have been obtained, we can find the coordinates of a point in the C3 or C2 coordinate system from its position in the photograph, under either of the following 2 conditions.
Condition 1: the point to be found is on the A plane;
Condition 2: the point to be found is not on the A plane, but its distance h from the A plane is known.
Let the position of the point to be found in the photograph be (px, py) (both pixel values, known), and take the vector VX = (px, py, 1).
Let the coordinates of the point to be found in the C2 coordinate system be (x, y), and take the vector V = (x, y, z, 1), where z = -depth under condition 1; under condition 2, z = -depth + h if the point is on the same side of the A plane as the camera, and z = -depth - h otherwise. Both depth and h are known here; x and y are the values we seek.
Let
VP=V*M
VXP=VX*Mr
The following system of equations (given as a figure in the original) can then be obtained, where VP[1] is the first component of the vector VP, and so on.
By solving this system, the x and y values above can be obtained; the coordinates (x, y) are the coordinates of the point to be found in the C2 coordinate system.
To determine the coordinates of the point in the C3 coordinate system, let
V*T*U1*Rx*Ry*Rz*U2=VV
The vector V and the matrices T, U1, Rx, Ry, Rz, U2 are known, and the first 3 components of the vector VV are the coordinates of the point to be determined in the C3 coordinate system.
If we know the position of a point on the target object, that is, the values of x, y and z in the vector V(x, y, z, 1) above, then its position in the photograph can be found as follows:
Let
V*T*U1*Rx*Ry*Rz*U2*P=V2
Then set
V3=(V2[1]/V2[4],V2[2]/V2[4],1)
Where V2[1] is the first component of vector V2, and so on.
Then let
V3*Mr=V4
The first two components of vector V4 are the location of the point in the photograph.
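The projection steps above can be sketched with a stand-in matrix. The ten real matrices exist only as figures in the original, so a bare perspective matrix with ta = 1 is assumed in place of T*U1*Rx*Ry*Rz*U2*P, in the same row-vector convention; the document's V2[1]/V2[4] and V2[2]/V2[4] (1-based) become 0-based indices here.

```python
import numpy as np

ta = 1.0
# Stand-in for the product T*U1*Rx*Ry*Rz*U2*P (an assumption, not the patent's
# matrices): the 4th component picks up -z so the divide yields perspective.
M_persp = np.array([[ta, 0.0, 0.0, 0.0],
                    [0.0, ta, 0.0, 0.0],
                    [0.0, 0.0, 1.0, -1.0],
                    [0.0, 0.0, 0.0, 0.0]])

V = np.array([0.5, -0.25, -2.0, 1.0])  # a point 2 units in front of the camera
V2 = V @ M_persp
# Perspective divide (the document's V2[1]/V2[4] and V2[2]/V2[4]):
V3 = np.array([V2[0] / V2[3], V2[1] / V2[3], 1.0])
# V3 @ Mr would then give the pixel position V4.
```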
If the view angle or focal length of the camera is known, the steps of the method differ from the above only as follows:
(1) We only need to know the relative positions of 3 points on the target object, and these 3 points must not lie on one straight line;
(2) of the 4 vectors P1(0, 0, -depth, 1), P2(x2, 0, -depth, 1), P3(x3, y3, -depth, 1), P4(x4, y4, -depth - deltaz4, 1), only the first 3 are used, namely P1, P2 and P3;
(3) of the 4 vectors PX1(px1, py1, 1), PX2(px2, py2, 1), PX3(px3, py3, 1), PX4(px4, py4 + factor, 1), only the first 3 are used, namely PX1, PX2 and PX3;
(4) equation set 1 therefore has only 6 equations;
(5) since the view angle of the camera is known, i.e. ta is known, equation set 1 has only 6 unknowns: dx, dy, tx, ty, tz, depth.
Claims (1)
1. A method for measuring target position data, shooting angle and camera view angle with a monocular camera, the method using a single video camera and, given the size of the target object, being able to measure the distance between the camera and the target and the relative position of any point on the target with respect to the camera (its components on the X, Y and Z axes of a specified three-dimensional coordinate system); being able to recover the actual position on the target object of any point in the picture from that point's position in the picture, and conversely to find where any point of the target object appears in the picture; and being able to measure the shooting angle of the camera (the angles between the shooting direction and the X, Y and Z axes of the specified three-dimensional coordinate system); the method being characterized in that the problem is solved with linear algebra and numerical solutions of nonlinear equation systems, embodied in matrices 1 to 10 and equation set 1, or any similar form of these matrices and equations.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201910983336.7A (CN110673122A) | 2019-10-16 | 2019-10-16 | Method for measuring target position data, shooting angle and camera view angle by monocular camera |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| CN110673122A | 2020-01-10 |
Family ID: 69082821
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201910983336.7A (CN110673122A, pending) | Method for measuring target position data, shooting angle and camera view angle by monocular camera | 2019-10-16 | 2019-10-16 |
Country Status (1)

| Country | Link |
|---|---|
| CN | CN110673122A |
Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5999615B2 * | 2011-10-07 | 2016-09-28 | National Institute of Information and Communications Technology | Camera calibration information generating apparatus, camera calibration information generating method, and camera calibration information generating program |
| CN109238235A * | 2018-06-29 | 2019-01-18 | South China Agricultural University | Monocular sequence image realizes rigid body pose parameter continuity measurement method |
| CN110057352A * | 2018-01-19 | 2019-07-26 | Beijing Tusen Weilai Technology Co., Ltd. | A kind of camera attitude angle determines method and device |
| CN110209997A * | 2019-06-10 | 2019-09-06 | Chengdu University of Technology | Depth camera automatic calibration algorithm based on three-dimensional feature points |
- 2019-10-16: Application filed (CN201910983336.7A, CN110673122A); status: Pending
Non-Patent Citations (2)

| Title |
|---|
| JASON CAMPBELL et al.: "A Robust Visual Odometry and Precipice Detection System Using Consumer-grade Monocular Vision", Proceedings of the 2005 IEEE International Conference on Robotics and Automation * |
| Yu Naigong et al.: "Research on a robot target localization and ranging method based on monocular vision", Computer Measurement & Control * |
Cited By (3)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113101630A * | 2021-04-08 | 2021-07-13 | Yang Qingping | Method for measuring the throwing distance of track-and-field throwing events and the jumping distance of jumping events based on image processing |
| CN116772730A * | 2023-08-22 | 2023-09-19 | Chengdu Rainpoo Technology Co., Ltd. | Crack size measurement method, computer storage medium and system |
| CN116772730B * | 2023-08-22 | 2023-11-10 | Chengdu Rainpoo Technology Co., Ltd. | Crack size measurement method, computer storage medium and system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109559355B (en) | Multi-camera global calibration device and method without public view field based on camera set | |
Zhang et al. | A robust and rapid camera calibration method by one captured image | |
Orghidan et al. | Camera calibration using two or three vanishing points | |
CN111192235B (en) | Image measurement method based on monocular vision model and perspective transformation | |
CN107578450B (en) | Method and system for calibrating assembly error of panoramic camera | |
Liu et al. | External parameter calibration of widely distributed vision sensors with non-overlapping fields of view | |
CN110673122A (en) | Method for measuring target position data, shooting angle and camera view angle by monocular camera | |
Feng et al. | Inertial measurement unit aided extrinsic parameters calibration for stereo vision systems | |
CN113450416B (en) | TCSC method applied to three-dimensional calibration of three-dimensional camera | |
CN113048888A (en) | Binocular vision-based remote three-dimensional displacement measurement method and system | |
Wang et al. | An improved measurement model of binocular vision using geometrical approximation | |
CN104807405A (en) | Three-dimensional coordinate measurement method based on light ray angle calibration | |
Chen et al. | Calibration of stereo cameras with a marked-crossed fringe pattern | |
Feng et al. | Algorithm for epipolar geometry and correcting monocular stereo vision based on a plane mirror | |
CN104641395A (en) | Image processing device and image processing method | |
WO2018143153A1 (en) | Position measurement device and position measurement method | |
JP2022025818A (en) | Three-dimensional bar arrangement data creation method and three-dimensional bar arrangement data creation system for bar arrangement measurement | |
Yu et al. | An improved projector calibration method for structured-light 3D measurement systems | |
Fu et al. | A flexible approach to light pen calibration for a monocular-vision-based coordinate measuring system | |
Xue et al. | Complete calibration of a structure-uniform stereovision sensor with free-position planar pattern | |
JP7033294B2 (en) | Imaging system, imaging method | |
WO2020075213A1 (en) | Measurement apparatus, measurement method, and microscopic system | |
Li et al. | Method for horizontal alignment deviation measurement using binocular camera without common target | |
Wang et al. | Estimation of extrinsic parameters for dynamic binocular stereo vision using unknown-sized rectangle images | |
Huang et al. | A novel multi-camera calibration method based on flat refractive geometry |
Legal Events

| Date | Code | Title |
|---|---|---|
| | PB01 | Publication |
| | SE01 | Entry into force of request for substantive examination |
| | WD01 | Invention patent application deemed withdrawn after publication |
Application publication date: 2020-01-10