CN109584308A - Position calibration method based on a space live-action map - Google Patents
Position calibration method based on a space live-action map
- Publication number
- CN109584308A, CN201811326043.3A, CN201811326043A
- Authority
- CN
- China
- Prior art keywords
- point
- camera
- angle
- coordinate
- picture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- Computer Graphics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
The present invention provides a position calibration method based on a space live-action map, comprising the following: step 1), picture display: obtain the resolution of the current camera and the width and height of the video display picture; step 2), calibration: obtain a random point A on the video picture; etc. After the camera moves, reacquire the above camera values and, from the angular offset of the camera, compute the three-dimensional coordinate O corresponding to the current picture centre point. Connect the offsets dx, dy, dz of point A with the camera origin O into a straight line, which intersects the space plane of the video picture at point M; deriving in reverse, point M of the calibration point on the video picture is found, and the coordinate of the original point A on the picture can be re-marked. The present invention needs no large number of measurement props before calibration, saving material and time costs and greatly shortening the calibration work cycle. For each individual camera, only the error coefficient needs to be adjusted and optimised to reduce the error.
Description
Technical field
The present invention relates to the field of big data technology, and in particular to a position calibration method based on a space live-action map.
Background technique
At present, because cameras have errors, various calibration methods exist, for example calibration with dual cameras, or calculating the error with a calibration board and then calibrating. These involve a heavy workload and a long preparation period, and are not easy to implement; in particular, after the camera rotates, the calibrated position offset becomes even larger.
Summary of the invention
The object of the present invention is to provide a position calibration method based on a space live-action map that needs neither the distance between the measured object and the camera nor error-reducing calibration objects, and that can still accurately locate the calibration point even after the camera rotates.
The present invention provides a position calibration method based on a space live-action map, comprising the following steps:
Step 1), picture display:
1.1, obtain the resolution of the current camera, RATIO_HEIGHT*RATIO_WIDTH;
1.2, obtain the width and height of the video display picture, SCREEN_H*SCREEN_W;
Step 2), calibration process:
2.1, obtain a random point A on the video picture and its (X, Y) coordinate;
2.2, from this coordinate, obtain the coordinate (SX, SY) of the corresponding pixel;
2.3, obtain the horizontal field angle of the camera, HOR_ANGLE;
2.4, obtain the vertical field angle of the camera, VER_ANGLE;
2.5, obtain the horizontal offset angle of the current camera, HOR_OFFSET_ANGLE;
2.6, obtain the vertical offset angle of the current camera, VER_OFFSET_ANGLE;
2.7, obtain the zoom factor of the current camera, SCALE_RATIO;
2.8, after projection, all pictures displayed in the video lie on a spherical surface; compute the three-dimensional coordinate of the current picture centre point. Taking the focal length as 1, define the three-dimensional coordinate system with the y direction positive downwards; PI is the approximate value 3.1415926 used in the trigonometric functions;
2.9, find the y coordinate corresponding to the current video centre point, pointCenterY=sin(VER_OFFSET_ANGLE*PI/180);
2.10, find the x coordinate corresponding to the current video centre point, pointCenterX=cos(HOR_OFFSET_ANGLE*PI/180);
2.11, assuming the Z axis is perpendicular to the X and Y axes, obtain the Z coordinate of the calibration point, pointCenterZ=fabs(pointCenterX*tan(fPanPos*PI/180)), where fPanPos is the pan position of the camera;
2.12, compute the width and height of the projection: videoScreenW=2*tan((HOR_ANGLE/2)*PI/180.0), videoScreenH=2*tan((VER_ANGLE/2)*PI/180.0);
2.13, taking the centre point of the video picture as the origin, pointX=X-SCREEN_W/2, pointY=Y-SCREEN_H/2;
2.14, compute the estimated coordinate of the calibration point in the two-dimensional plane:
realX=(1+m_factor*(pow(pointX, 2)+pow(pointY, 2)))*pointX;
realY=(1+m_factor*(pow(pointX, 2)+pow(pointY, 2)))*pointY;
where m_factor is an adjustable error deviation coefficient; the larger m_factor is, the smaller the offset;
2.15, from the estimated values, obtain the distance to the centre point;
2.16, scaleWidth=realX/SCREEN_W;
2.17, scaleHeight=realY/SCREEN_H;
2.18, obtain the angle of the point relative to the centre point:
angleW=scaleWidth*videoScreenW;
angleH=scaleHeight*videoScreenH;
2.19, let the offset of the mark point relative to the centre point coordinate be dx, dy, dz; the following equations are obtained:
tan(fPanPos*PI/180)=dx/dz;
dx*dx+dz*dz=angleW*angleW;
cos(fTiltPos*PI/180)=dy/angleH;
2.20, solving the simultaneous equations with a quadrant judgment yields dx, dy, dz, from which the estimated coordinate of the point in three-dimensional space is found;
2.21, after the camera moves, reacquire the above camera values and, from the angular offset of the camera, compute the three-dimensional coordinate O (centerX, centerY, centerZ) corresponding to the current picture centre point; connect dx, dy, dz of point A with the camera origin O (0,0,0) into a straight line, which intersects the space plane of the video picture at point M; deriving in reverse, point M (X1, Y1, Z1) of the calibration point on the video picture is found, and the coordinate of the original point A on the picture can be re-marked.
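As an illustration, steps 2.12-2.18 above can be sketched in Python as follows. The identifiers mirror those in the text (videoScreenW, pointX, realX, angleW, etc.); the concrete resolution and field-angle values in the usage note are assumed example inputs, not values from the patent.

```python
import math

def pixel_to_angles(x, y, screen_w, screen_h,
                    hor_angle, ver_angle, m_factor=0.0):
    """Map a picture point (x, y) to angular offsets from the
    picture centre, following steps 2.12-2.18."""
    # 2.12: width and height of the projection plane at focal length 1
    video_screen_w = 2 * math.tan(math.radians(hor_angle / 2))
    video_screen_h = 2 * math.tan(math.radians(ver_angle / 2))
    # 2.13: coordinates relative to the picture centre
    point_x = x - screen_w / 2
    point_y = y - screen_h / 2
    # 2.14: radial error correction with coefficient m_factor
    r2 = point_x ** 2 + point_y ** 2
    real_x = (1 + m_factor * r2) * point_x
    real_y = (1 + m_factor * r2) * point_y
    # 2.16-2.17: normalise by the display width and height
    scale_w = real_x / screen_w
    scale_h = real_y / screen_h
    # 2.18: angular offsets of the point relative to the centre
    angle_w = scale_w * video_screen_w
    angle_h = scale_h * video_screen_h
    return angle_w, angle_h
```

With an assumed 1920x1080 picture and a 60°x34° field of view, the picture centre maps to (0, 0), and a point on the right edge maps to a horizontal offset of tan(30°).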
The professional terms involved in the present invention are explained as follows:
1. Resolution: the pixel count of the maximum image size the camera can support, given as horizontal by vertical pixels, e.g. 640x480 (standard definition), 800x600, 1280x720 (high definition), 1920x1080 (full HD or ultra-clear).
2. Frame rate: the maximum video capture capability the camera can support at its maximum resolution, generally 15-25 frames per second (FPS).
3. Object distance: the distance from the object to the camera.
4. Focal length: the distance between the lens and the photosensitive element. Changing the focal length of the lens changes its magnification and hence the size of the captured image. Magnification of the lens ≈ focal length / object distance.
5. Field angle: reflects the coverage of the picture. At a fixed focal length, the smaller the field angle, the smaller the picture range formed on the photosensitive element; conversely, the larger the field angle, the larger the picture range. The horizontal and vertical field angles are not necessarily equal.
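Terms 4 and 5 can be illustrated with a small numeric sketch; all concrete values below (an 8 mm lens, a 4 m object distance, a 60° field angle) are assumed for illustration only.

```python
import math

# Term 4: magnification of the lens ≈ focal length / object distance.
focal_length_mm = 8.0        # example lens (assumed value)
object_distance_mm = 4000.0  # object 4 m from the camera (assumed value)
magnification = focal_length_mm / object_distance_mm

# Term 5: at a fixed focal length, a larger field angle covers a larger
# picture range; at unit focal length the projection plane spans
# 2 * tan(field_angle / 2), the same relation used in step 2.12.
hor_angle_deg = 60.0
plane_width = 2 * math.tan(math.radians(hor_angle_deg / 2))
```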
Compared with the prior art, the beneficial effects of the present invention are as follows:
1. No large number of measurement props is needed before calibration, saving material and time costs; that is, compared with the prior art, the calibration work cycle is greatly shortened.
2. For each individual camera, only the error coefficient needs to be adjusted and optimised to reduce the error.
Detailed description of the invention
Fig. 1 is a schematic flow chart of the position calibration method based on a space live-action map of the present invention.
Fig. 2 is a schematic diagram of the relation between an entity and the pixels of the camera imaging picture in the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings of the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art from the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Figs. 1-2, the present invention provides a position calibration method based on a space live-action map, comprising the following steps:
Step (1), picture display:
1.1, obtain the resolution of the current camera, RATIO_HEIGHT*RATIO_WIDTH;
1.2, obtain the width and height of the video display picture, SCREEN_HEIGHT*SCREEN_WIDTH;
Step (2), calibration process:
2.1, obtain a random point A on the video picture and its (X, Y) coordinate on the picture;
2.2, from this coordinate, obtain the coordinate (SX, SY) of the corresponding pixel;
2.3, obtain the horizontal field angle of the camera, HOR_ANGLE;
2.4, obtain the vertical field angle of the camera, VER_ANGLE;
2.5, obtain the horizontal offset angle of the current camera, HOR_OFFSET_ANGLE;
2.6, obtain the vertical offset angle of the current camera, VER_OFFSET_ANGLE;
2.7, obtain the zoom factor of the current camera, SCALE_RATIO, as in Fig. 1;
2.8, after projection, all pictures displayed in the video lie on a spherical surface; compute the three-dimensional coordinate of the current picture centre point. Taking the focal length as 1, define the three-dimensional coordinate system with the y direction positive downwards; PI is the approximate value 3.1415926 used in the trigonometric functions;
2.9, find the y coordinate corresponding to the current video centre point, pointCenterY=sin(VER_OFFSET_ANGLE*PI/180);
2.10, find the x coordinate corresponding to the current video centre point, pointCenterX=cos(HOR_OFFSET_ANGLE*PI/180);
2.11, assuming the Z axis is perpendicular to the X and Y axes, obtain the Z coordinate of the calibration point, pointCenterZ=fabs(pointCenterX*tan(fPanPos*PI/180)), where fPanPos is the pan position of the camera;
2.12, compute the width and height of the projection: videoScreenW=2*tan((HOR_ANGLE/2)*PI/180.0), videoScreenH=2*tan((VER_ANGLE/2)*PI/180.0);
2.13, taking the centre point of the video picture as the origin, pointX=X-SCREEN_W/2, pointY=Y-SCREEN_H/2;
2.14, compute the estimated coordinate of the calibration point in the two-dimensional plane:
realX=(1+m_factor*(pow(pointX, 2)+pow(pointY, 2)))*pointX;
realY=(1+m_factor*(pow(pointX, 2)+pow(pointY, 2)))*pointY;
where m_factor is an adjustable error deviation coefficient; the larger m_factor is, the smaller the offset, and it can be optimised according to the actual calibrated result.
2.15, from the estimated values, obtain the distance to the centre point;
2.16, scaleWidth=realX/SCREEN_W;
2.17, scaleHeight=realY/SCREEN_H;
2.18, obtain the angle of the point relative to the centre point:
angleW=scaleWidth*videoScreenW;
angleH=scaleHeight*videoScreenH;
2.19, let the offset of the mark point relative to the centre point coordinate be dx, dy, dz; the following equations are obtained:
tan(fPanPos*PI/180)=dx/dz;
dx*dx+dz*dz=angleW*angleW;
cos(fTiltPos*PI/180)=dy/angleH;
2.20, solving the simultaneous equations with a quadrant judgment yields dx, dy, dz, from which the estimated coordinate of the point in three-dimensional space is found, as in Fig. 1;
2.21, after the camera moves, reacquire the above camera values and, from the angular offset of the camera, compute the three-dimensional coordinate O (centerX, centerY, centerZ) corresponding to the current picture centre point; connect dx, dy, dz of point A with the camera origin O (0,0,0) into a straight line, which intersects the space plane of the video picture at point M; deriving in reverse, point M (X1, Y1, Z1) of the calibration point on the video picture is found, and the coordinate of the original point A on the picture can be re-marked, as in Fig. 2.
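The system of equations in step 2.19 and the quadrant judgment of step 2.20 can be sketched as follows. The closed form below is one consistent solution of the three equations, under the interpretation (an assumption, since the patent does not spell out the solution) that the sign information of the quadrant is carried by the pan angle itself.

```python
import math

def solve_offsets(pan_deg, tilt_deg, angle_w, angle_h):
    """Solve the step-2.19 system for (dx, dy, dz):
         tan(pan) = dx / dz
         dx^2 + dz^2 = angle_w^2
         cos(tilt) = dy / angle_h
    Taking dx = angle_w*sin(pan) and dz = angle_w*cos(pan) satisfies the
    first two equations in every quadrant of the pan angle."""
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    dx = angle_w * math.sin(pan)
    dz = angle_w * math.cos(pan)
    dy = angle_h * math.cos(tilt)
    return dx, dy, dz
```

For example, with an assumed pan of 30°, tilt of 45°, angleW = 2 and angleH = 1, the returned (dx, dy, dz) satisfies all three equations of step 2.19.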
Finally, it should be noted that the foregoing are only preferred embodiments of the present invention and are not intended to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described therein or replace some of their technical features with equivalents. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall be included within its protection scope.
Claims (1)
1. A position calibration method based on a space live-action map, characterised by comprising the following steps:
Step (1), picture display:
1.1, obtain the resolution of the current camera, RATIO_HEIGHT*RATIO_WIDTH;
1.2, obtain the width and height of the video display picture, SCREEN_H*SCREEN_W;
Step (2), calibration process:
2.1, obtain a random point A on the video picture and its (X, Y) coordinate;
2.2, from this coordinate, obtain the coordinate (SX, SY) of the corresponding pixel;
2.3, obtain the horizontal field angle of the camera, HOR_ANGLE;
2.4, obtain the vertical field angle of the camera, VER_ANGLE;
2.5, obtain the horizontal offset angle of the current camera, HOR_OFFSET_ANGLE;
2.6, obtain the vertical offset angle of the current camera, VER_OFFSET_ANGLE;
2.7, obtain the zoom factor of the current camera, SCALE_RATIO;
2.8, after projection, all pictures displayed in the video lie on a spherical surface; compute the three-dimensional coordinate of the current picture centre point. Taking the focal length as 1, define the three-dimensional coordinate system with the y direction positive downwards; PI is the approximate value 3.1415926 used in the trigonometric functions;
2.9, find the y coordinate corresponding to the current video centre point, pointCenterY=sin(VER_OFFSET_ANGLE*PI/180);
2.10, find the x coordinate corresponding to the current video centre point, pointCenterX=cos(HOR_OFFSET_ANGLE*PI/180);
2.11, assuming the Z axis is perpendicular to the X and Y axes, obtain the Z coordinate of the calibration point, pointCenterZ=fabs(pointCenterX*tan(fPanPos*PI/180));
2.12, compute the width and height of the projection: videoScreenW=2*tan((HOR_ANGLE/2)*PI/180.0), videoScreenH=2*tan((VER_ANGLE/2)*PI/180.0);
2.13, taking the centre point of the video picture as the origin, pointX=X-SCREEN_W/2, pointY=Y-SCREEN_H/2;
2.14, compute the estimated coordinate of the calibration point in the two-dimensional plane:
realX=(1+m_factor*(pow(pointX, 2)+pow(pointY, 2)))*pointX;
realY=(1+m_factor*(pow(pointX, 2)+pow(pointY, 2)))*pointY;
where m_factor is an adjustable error deviation coefficient; the larger m_factor is, the smaller the offset;
2.15, from the estimated values, obtain the distance to the centre point;
2.16, scaleWidth=realX/SCREEN_W;
2.17, scaleHeight=realY/SCREEN_H;
2.18, obtain the angle of the point relative to the centre point:
angleW=scaleWidth*videoScreenW;
angleH=scaleHeight*videoScreenH;
2.19, let the offset of the mark point relative to the centre point coordinate be dx, dy, dz; the following equations are obtained:
tan(fPanPos*PI/180)=dx/dz;
dx*dx+dz*dz=angleW*angleW;
cos(fTiltPos*PI/180)=dy/angleH;
2.20, solving the simultaneous equations with a quadrant judgment yields dx, dy, dz, from which the estimated coordinate of the point in three-dimensional space is found;
2.21, after the camera moves, reacquire the above camera values; from the angular offset of the camera, connect dx, dy, dz with the camera origin into a straight line, which intersects the space plane of the video picture; deriving in reverse, point M (X1, Y1, Z1) of the calibration point on the video picture is found, and the coordinate of the original point A on the picture can be re-marked.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811326043.3A CN109584308B (en) | 2018-11-08 | 2018-11-08 | Position calibration method based on space live-action map |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811326043.3A CN109584308B (en) | 2018-11-08 | 2018-11-08 | Position calibration method based on space live-action map |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109584308A true CN109584308A (en) | 2019-04-05 |
CN109584308B CN109584308B (en) | 2023-04-28 |
Family
ID=65921909
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811326043.3A Active CN109584308B (en) | 2018-11-08 | 2018-11-08 | Position calibration method based on space live-action map |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109584308B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110308741A (en) * | 2019-07-16 | 2019-10-08 | 杭州叙简科技股份有限公司 | Multi-point unmanned aerial vehicle detection and defense system and multi-point unmanned aerial vehicle detection and striking method |
CN115375779A (en) * | 2022-10-27 | 2022-11-22 | 智广海联(天津)大数据技术有限公司 | Method and system for marking AR (augmented reality) real scene of camera |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009048516A (en) * | 2007-08-22 | 2009-03-05 | Sony Corp | Information processor, information processing method and computer program |
CN105991929A (en) * | 2016-06-21 | 2016-10-05 | 浩云科技股份有限公司 | Extrinsic parameter calibration and whole-space video stitching method for whole-space camera |
CN107862703A (en) * | 2017-10-31 | 2018-03-30 | 天津天地伟业信息系统集成有限公司 | Multi-camera linked PTZ tracking method |
- 2018-11-08 CN CN201811326043.3A patent/CN109584308B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009048516A (en) * | 2007-08-22 | 2009-03-05 | Sony Corp | Information processor, information processing method and computer program |
CN105991929A (en) * | 2016-06-21 | 2016-10-05 | 浩云科技股份有限公司 | Extrinsic parameter calibration and whole-space video stitching method for whole-space camera |
CN107862703A (en) * | 2017-10-31 | 2018-03-30 | 天津天地伟业信息系统集成有限公司 | Multi-camera linked PTZ tracking method |
Non-Patent Citations (1)
Title |
---|
Wang Ziheng et al.: "Nonlinear camera calibration method", Computer Engineering and Design (《计算机工程与设计》) * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110308741A (en) * | 2019-07-16 | 2019-10-08 | 杭州叙简科技股份有限公司 | Multi-point unmanned aerial vehicle detection and defense system and multi-point unmanned aerial vehicle detection and striking method |
CN110308741B (en) * | 2019-07-16 | 2022-02-11 | 杭州叙简科技股份有限公司 | Multipoint unmanned aerial vehicle detection defense system and multipoint unmanned aerial vehicle detection striking method |
CN115375779A (en) * | 2022-10-27 | 2022-11-22 | 智广海联(天津)大数据技术有限公司 | Method and system for marking AR (augmented reality) real scene of camera |
Also Published As
Publication number | Publication date |
---|---|
CN109584308B (en) | 2023-04-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102580961B1 (en) | 3D modeling system and method based on photography, automatic 3D modeling device and method | |
CN105453136B | System, method and apparatus for three-dimensional roll correction using auto-focus feedback | |
US8436904B2 (en) | Method and apparatus for calibrating video camera | |
Karpenko et al. | Digital video stabilization and rolling shutter correction using gyroscopes | |
US9172871B2 (en) | Method and device for multi-camera image correction | |
CN106097367B | Calibration method and device for a binocular stereo camera | |
US10825249B2 (en) | Method and device for blurring a virtual object in a video | |
US20150248744A1 (en) | Image processing device, image processing method, and information processing device | |
US20140125772A1 (en) | Image processing apparatus and method, image processing system and program | |
CN109920004A (en) | Image processing method, device, the combination of calibration object, terminal device and calibration system | |
JP2011506914A (en) | System and method for multi-frame surface measurement of object shape | |
WO2016155110A1 (en) | Method and system for correcting image perspective distortion | |
JP2014526823A (en) | Method and apparatus for improved cropping of stereoscopic image pairs | |
US20110091131A1 (en) | System and method for stabilization of fisheye video imagery | |
JPWO2005024723A1 (en) | Image composition system, image composition method and program | |
CN109584308A | Position calibration method based on a space live-action map | |
CN114727081A (en) | Projector projection correction method and device and projector | |
TW201443827A (en) | Camera image calibrating system and method of calibrating camera image | |
CN111131801B (en) | Projector correction system and method and projector | |
CN113048888A (en) | Binocular vision-based remote three-dimensional displacement measurement method and system | |
JP6178127B2 (en) | Building measuring apparatus and measuring method | |
KR20060125148A (en) | Method for extracting 3-dimensional coordinate information from 3-dimensional image using mobile phone with multiple cameras and terminal thereof | |
CN111279685A (en) | Motion estimation | |
JP2005323021A (en) | In-vehicle imaging system and imaging method | |
TWI424259B (en) | Camera calibration method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||