CN109584308B - Position calibration method based on space live-action map
- Publication number: CN109584308B (application CN201811326043.3A)
- Authority: CN (China)
- Prior art keywords: point, camera, angle, offset, obtaining
- Prior art date: 2018-11-08
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- Computer Graphics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
The invention provides a position calibration method based on a space live-action map, which comprises the following steps: step 1), picture display: acquire the resolution of the current camera and the width and height of the video display picture; step 2), calibration: select an arbitrary point A on the video picture and derive its offsets dx, dy, dz in three-dimensional space. When the camera moves, the camera values are acquired again, and the three-dimensional coordinates corresponding to the image center at that moment are obtained from the camera's angular offsets; the offsets dx, dy, dz of point A and the camera origin O define a straight line that intersects the spatial plane of the video picture at a point M; back-deriving the calibration point M on the video picture allows the coordinates of the original point A on the picture to be calibrated again. The invention requires no large set of measuring props before calibration, saving material and time costs, and greatly shortens the calibration work cycle. For each individual camera, only the error coefficient needs to be adjusted and optimized to reduce the error value.
Description
Technical Field
The invention relates to the technical field of big data, in particular to a position calibration method based on a space live-action map.
Background
At present, cameras are calibrated in various ways because of their errors, for example with dual cameras or by computing the error with calibration plates before calibrating. These methods involve a heavy workload and a long preparation period and are not easy to implement; in particular, when the camera pans and tilts, the calibrated positions drift even further.
Disclosure of Invention
The invention aims to provide a position calibration method based on a space live-action map which does not require measuring the distance between an object and the camera, reduces the error in calibrating an object, and can still accurately position a calibration point when the camera rotates.
The invention provides a position calibration method based on a space live-action map, which comprises the following steps:
step 1), picture display:
1.1. Acquire the resolution of the current camera: RESOLUTION_HIGH × RESOLUTION_WIDTH;
1.2. Acquire the width and height of the video display picture: SCREEN_W × SCREEN_H;
step 2), a calibration process:
2.1. Select an arbitrary point A on the video picture and read its (X, Y) coordinates;
2.2. From those coordinates, obtain the coordinates (X, Y) of the corresponding pixel;
2.3. Acquire the horizontal field-of-view angle of the camera, HOR_ANGLE;
2.4. Acquire the vertical field-of-view angle of the camera, VER_ANGLE;
2.5. Acquire the current horizontal offset angle of the camera, HOR_OFFSET_ANGLE;
2.6. Acquire the current vertical offset angle of the camera, VER_OFFSET_ANGLE;
2.7. Acquire the current zoom factor of the camera, SCALE_RATIO;
2.8. All pictures lie on a sphere and are displayed in the video by projection. Compute the three-dimensional coordinates of the current plane's center point, taking the focal length as 1 and defining the three-dimensional coordinate system with the y direction positive downward; PI is the approximation 3.1415926 used in the trigonometric functions;
2.9. Obtain the vertical angle corresponding to the current video center point: pointCenterY = sin(VER_OFFSET_ANGLE * PI / 180);
2.10. Compute the horizontal angle corresponding to the current video center point: pointCenterX = cos(HOR_OFFSET_ANGLE * PI / 180);
2.11. The Z vector intersects the X and Y axes perpendicularly, giving the coordinate of the calibration point in the Z direction: pointCenterZ = fabs(pointCenterX * tan(fPanPos * PI / 180));
2.12. Compute the projection width and height: ShadowSCREEN_W = 2 * tan((HOR_ANGLE / 2) * PI / 180.0), ShadowSCREEN_H = 2 * tan((VER_ANGLE / 2) * PI / 180.0);
2.13. Compute the coordinates of the random point relative to the center point: pointX = X - SCREEN_W / 2, pointY = Y - SCREEN_H / 2;
2.14. Compute the estimated coordinate values of the calibration point in the two-dimensional plane:
realX=(1+m_factor*(pow(pointX,2)+pow(pointY,2)))*pointX;
realY=(1+m_factor*(pow(pointX,2)+pow(pointY,2)))*pointY;
where m_factor is an adjustable error offset coefficient; the larger m_factor is, the smaller the offset;
2.15. Obtain the distance from the point to the center point from these estimated values;
2.16. scaleWidth = realX / SCREEN_W;
2.17. scaleHeight = realY / SCREEN_H;
2.18. Obtain the angles of the point relative to the center point:
angleW = scaleWidth * ShadowSCREEN_W;
angleH = scaleHeight * ShadowSCREEN_H;
2.19. Let the offsets of the calibration point relative to the center point coordinates be dx, dy, dz; then the following equations hold:
tan(fPanPos*PI/180)=dx/dz;
dx*dx+dz*dz=angleW*angleW;
cos(fTiltPos*PI/180)=dy/angleH;
2.20. Solving these equations simultaneously, with the signs determined by quadrant, yields dx, dy, and dz, and hence the estimated coordinates of the point in three-dimensional space;
2.21. When the camera moves, the above camera values are acquired again, and the three-dimensional coordinates O'(centerX, centerY, centerZ) corresponding to the image center at that moment are obtained from the camera's angular offsets. The offsets (dx, dy, dz) of point A and the camera origin O(0, 0, 0) define a straight line that intersects the spatial plane of the video picture at a point M. Back-deriving gives the calibration point M(X1, Y1, Z1) on the video picture, so the coordinates of the original point A on the picture can be calibrated again.
The terms used in the invention are explained as follows:
1. Resolution: the maximum image size, in pixels, supported by the camera, such as 640×480 (standard definition), 800×600, 1280×720 (high definition), or 1920×1080 (full high definition); it refers to the number of pixels in the horizontal and vertical directions.
2. Frame rate: the highest video capture rate the camera can support at its maximum resolution, typically 15–25 frames per second (FPS).
3. Object distance: distance of the object from the camera.
4. Focal length: the distance between the lens and the photosensitive element. Changing the focal length of the lens changes its magnification and therefore the size of the captured image; the magnification of the lens is approximately equal to focal length / object distance.
5. Angle of view: reflects the shooting range of the picture. At a fixed focal length, the smaller the angle of view, the smaller the picture range formed on the photosensitive element; conversely, the larger the angle of view, the larger the picture range. The horizontal and vertical angles of view may differ (a worked example follows this list).
6. Horizontal and vertical included angles of the camera: the horizontal and vertical field-of-view angles of the camera.
7. Horizontal offset angle of the camera: the angle of the camera relative to its initial horizontal position.
8. Vertical offset angle of the camera: the angle of the camera relative to its initial vertical position.
9. fabs: the fabs function returns the absolute value of its argument.
10. pow: the pow function computes a power; pow(x, y) is x raised to the power y.
11. SCREEN_W and SCREEN_H: the horizontal and vertical pixel dimensions of the displayed picture.
12. fPanPos and fTiltPos: the horizontal (pan) angle and the vertical (tilt) angle of the camera, respectively.
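As a worked illustration of terms 4 and 5 and of step 2.12 (a sketch under the focal-length-1 convention of step 2.8; the 60° value is an invented example, not taken from the patent):

$$\text{magnification} \approx \frac{\text{focal length}}{\text{object distance}}, \qquad \mathrm{ShadowSCREEN\_W} = 2\tan\left(\frac{\mathrm{HOR\_ANGLE}}{2}\right)$$

For instance, a horizontal field-of-view angle HOR_ANGLE of 60° gives a projection-plane width of 2 tan 30° ≈ 1.155 in unit-focal-length coordinates.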
Compared with the prior art, the invention has the following beneficial effects:
1. No large set of measuring props is needed before calibration, saving material and time costs; compared with the prior art, the calibration work cycle is greatly shortened.
2. For each individual camera, only the error coefficient needs to be adjusted and optimized to reduce the error value.
Drawings
Fig. 1 is a functional flow diagram of a position calibration method based on a space live-action map.
Fig. 2 is a schematic diagram of a relationship between a pixel point of an entity and an imaging frame of a camera according to the present invention.
Detailed Description
The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings. The described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to fig. 1-2, the invention provides a position calibration method based on a space live-action map, which comprises the following steps:
step (1), displaying a picture:
1.1. Acquire the resolution of the current camera: RESOLUTION_HIGH × RESOLUTION_WIDTH;
1.2. Acquire the width and height of the video display picture: SCREEN_W × SCREEN_H;
step (2), a calibration process:
2.1. Select an arbitrary point A on the video picture and read its (X, Y) coordinates on the picture;
2.2. From those coordinates, obtain the coordinates (X, Y) of the corresponding pixel;
2.3. Acquire the horizontal field-of-view angle of the camera, HOR_ANGLE;
2.4. Acquire the vertical field-of-view angle of the camera, VER_ANGLE;
2.5. Acquire the current horizontal offset angle of the camera, HOR_OFFSET_ANGLE;
2.6. Acquire the current vertical offset angle of the camera, VER_OFFSET_ANGLE;
2.7. Acquire the current zoom factor of the camera, SCALE_RATIO, as shown in figure 1;
2.8. All pictures lie on a sphere and are displayed in the video by projection. Compute the three-dimensional coordinates of the current plane's center point, taking the focal length as 1 and defining the three-dimensional coordinate system with the y direction positive downward; PI is the approximation 3.1415926 used in the trigonometric functions;
2.9. Obtain the vertical angle corresponding to the current video center point: pointCenterY = sin(VER_OFFSET_ANGLE * PI / 180);
2.10. Compute the horizontal angle corresponding to the current video center point: pointCenterX = cos(HOR_OFFSET_ANGLE * PI / 180);
2.11. The Z vector intersects the X and Y axes perpendicularly, giving the coordinate of the calibration point in the Z direction: pointCenterZ = fabs(pointCenterX * tan(fPanPos * PI / 180));
2.12. Compute the projection width and height: ShadowSCREEN_W = 2 * tan((HOR_ANGLE / 2) * PI / 180.0), ShadowSCREEN_H = 2 * tan((VER_ANGLE / 2) * PI / 180.0);
2.13. Compute the coordinates of the random point relative to the center point: pointX = X - SCREEN_W / 2, pointY = Y - SCREEN_H / 2;
2.14. Compute the estimated coordinate values of the calibration point in the two-dimensional plane:
realX=(1+m_factor*(pow(pointX,2)+pow(pointY,2)))*pointX;
realY=(1+m_factor*(pow(pointX,2)+pow(pointY,2)))*pointY;
Here m_factor is an adjustable error offset coefficient; the larger m_factor is, the smaller the offset. It can be optimized according to the actual calibration results (the code sketches after step 2.21 below make these quantities concrete).
2.15. Obtain the distance from the point to the center point from these estimated values;
2.16. scaleWidth = realX / SCREEN_W;
2.17. scaleHeight = realY / SCREEN_H;
2.18. Obtain the angles of the point relative to the center point:
angleW = scaleWidth * ShadowSCREEN_W;
angleH = scaleHeight * ShadowSCREEN_H;
2.19. Let the offsets of the calibration point relative to the center point coordinates be dx, dy, dz; then the following equations hold:
tan(fPanPos*PI/180)=dx/dz;
dx*dx+dz*dz=angleW*angleW;
cos(fTiltPos*PI/180)=dy/angleH;
2.20. Solving these equations simultaneously, with the signs determined by quadrant, yields dx, dy, and dz, and hence the estimated coordinates of the point in three-dimensional space, as shown in figure 1;
2.21. When the camera moves, the above camera values are acquired again, and the three-dimensional coordinates O'(centerX, centerY, centerZ) corresponding to the image center at that moment are obtained from the camera's angular offsets. The offsets (dx, dy, dz) of point A and the camera origin O(0, 0, 0) define a straight line that intersects the spatial plane of the video picture at a point M. Back-deriving gives the calibration point M(X1, Y1, Z1) on the video picture, so the coordinates of the original point A on the picture can be calibrated again, as shown in fig. 2. Illustrative code sketches of steps 2.8 to 2.21 follow below.
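To make the geometry of steps 2.8 to 2.12 concrete, the following C++ sketch computes the center-point direction and the projection-plane size. It is a minimal illustration, not the patented implementation: the struct CameraState and the function names are hypothetical, and fPanPos in step 2.11 is read here as the same pan reading as HOR_OFFSET_ANGLE, which is one possible interpretation of the patent's notation.

```cpp
#include <cmath>

// Hypothetical container for the camera readings of steps 2.1 to 2.7.
struct CameraState {
    double horAngle;        // HOR_ANGLE: horizontal field-of-view angle, degrees
    double verAngle;        // VER_ANGLE: vertical field-of-view angle, degrees
    double horOffsetAngle;  // HOR_OFFSET_ANGLE: current pan offset, degrees
    double verOffsetAngle;  // VER_OFFSET_ANGLE: current tilt offset, degrees
    double scaleRatio;      // SCALE_RATIO: current zoom factor
};

const double PI = 3.1415926;  // the approximation used in the patent

// Steps 2.8 to 2.11: direction of the current view center on the unit sphere
// (focal length 1, y positive downward).
void centerDirection(const CameraState& c,
                     double& pointCenterX, double& pointCenterY, double& pointCenterZ) {
    pointCenterY = std::sin(c.verOffsetAngle * PI / 180.0);  // step 2.9
    pointCenterX = std::cos(c.horOffsetAngle * PI / 180.0);  // step 2.10
    // Step 2.11, with fPanPos taken to be the pan reading HOR_OFFSET_ANGLE.
    pointCenterZ = std::fabs(pointCenterX * std::tan(c.horOffsetAngle * PI / 180.0));
}

// Step 2.12: width and height of the projection plane at focal length 1.
void projectionSize(const CameraState& c, double& shadowW, double& shadowH) {
    shadowW = 2.0 * std::tan((c.horAngle / 2.0) * PI / 180.0);
    shadowH = 2.0 * std::tan((c.verAngle / 2.0) * PI / 180.0);
}
```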
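Steps 2.13 to 2.20 can then be sketched as below, continuing the previous sketch (same PI constant and headers). Again this is an illustrative reading, not the authoritative implementation; pointAngles and solveOffsets are hypothetical names. Writing dx and dz through sin and cos satisfies both tan(pan) = dx/dz and dx² + dz² = angleW² while carrying the quadrant signs of step 2.20 automatically.

```cpp
// Continues the previous sketch: assumes <cmath> and PI from above.

// Steps 2.13 to 2.18: relative coordinates, radial error correction with the
// tunable coefficient m_factor, and conversion to the angles angleW, angleH.
void pointAngles(double X, double Y,
                 double screenW, double screenH,  // SCREEN_W, SCREEN_H
                 double shadowW, double shadowH,  // ShadowSCREEN_W/H from step 2.12
                 double m_factor,
                 double& angleW, double& angleH) {
    double pointX = X - screenW / 2.0;              // step 2.13
    double pointY = Y - screenH / 2.0;
    double r2 = std::pow(pointX, 2) + std::pow(pointY, 2);
    double realX = (1.0 + m_factor * r2) * pointX;  // step 2.14
    double realY = (1.0 + m_factor * r2) * pointY;
    double scaleWidth  = realX / screenW;           // steps 2.16 and 2.17
    double scaleHeight = realY / screenH;
    angleW = scaleWidth  * shadowW;                 // step 2.18
    angleH = scaleHeight * shadowH;
}

// Steps 2.19 and 2.20: solve the three equations for dx, dy, dz.
void solveOffsets(double fPanPos, double fTiltPos, double angleW, double angleH,
                  double& dx, double& dy, double& dz) {
    double pan  = fPanPos  * PI / 180.0;
    double tilt = fTiltPos * PI / 180.0;
    dx = angleW * std::sin(pan);   // tan(pan) = dx/dz and dx^2 + dz^2 = angleW^2
    dz = angleW * std::cos(pan);   // quadrant signs carried by sin/cos
    dy = angleH * std::cos(tilt);  // cos(tilt) = dy/angleH
}
```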
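Step 2.21 intersects the line from the camera origin through A with the spatial plane of the new video picture. The sketch below is one geometric reading of that step, not a verified reproduction of the patent's derivation: the new view plane is taken as tangent to the view sphere at the new center direction C, so its equation is dot(C, P) = dot(C, C), and the ray from O through A is intersected with it.

```cpp
// Continues the sketches above (requires <cmath>).
struct Vec3 { double x, y, z; };

double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Step 2.21 (one reading): intersect the ray from the camera origin O(0,0,0)
// through the stored point A = (dx, dy, dz) with the plane through the new
// center point C whose normal is C itself; the hit point is M(X1, Y1, Z1),
// the re-calibrated position of A on the new video picture.
bool reprojectPoint(const Vec3& C, const Vec3& A, Vec3& M) {
    double denom = dot(C, A);
    if (std::fabs(denom) < 1e-9) return false;  // ray parallel to the plane
    double t = dot(C, C) / denom;
    if (t <= 0.0) return false;                 // A lies behind the camera
    M = {t * A.x, t * A.y, t * A.z};
    return true;
}
```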
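A possible end-to-end wiring of the sketches, assuming the three blocks above are placed in one file; all numeric values are invented for illustration and are not taken from the patent.

```cpp
#include <cstdio>

int main() {
    // Invented camera readings: 60°/34° field of view, pan 30°, tilt 10°.
    CameraState cam{60.0, 34.0, 30.0, 10.0, 1.0};
    double cx, cy, cz, sw, sh, aw, ah, dx, dy, dz;
    centerDirection(cam, cx, cy, cz);
    projectionSize(cam, sw, sh);
    // Point A picked at pixel (1200, 700) on a 1920x1080 picture, m_factor 1e-7.
    pointAngles(1200, 700, 1920, 1080, sw, sh, 1e-7, aw, ah);
    solveOffsets(cam.horOffsetAngle, cam.verOffsetAngle, aw, ah, dx, dy, dz);

    // After a simulated camera move, recompute the center and re-project A.
    cam.horOffsetAngle = 35.0;
    cam.verOffsetAngle = 12.0;
    centerDirection(cam, cx, cy, cz);
    Vec3 M;
    if (reprojectPoint({cx, cy, cz}, {dx, dy, dz}, M))
        std::printf("M = (%f, %f, %f)\n", M.x, M.y, M.z);
    return 0;
}
```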
Finally, it should be noted that the foregoing is only a preferred embodiment of the present invention, and the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, a person skilled in the art may still modify the technical solutions described in the foregoing embodiments or replace some of their technical features with equivalents. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.
Claims (1)
1. A position calibration method based on a space live-action map, characterized by comprising the following steps:
step (1), displaying a picture:
1.1. Acquire the resolution of the current camera: RESOLUTION_HIGH × RESOLUTION_WIDTH;
1.2. Acquire the width and height of the video display picture: SCREEN_W × SCREEN_H;
step (2), a calibration process:
2.1. Select an arbitrary point A on the video picture and read its (X, Y) coordinates;
2.2. From those coordinates, obtain the coordinates (X, Y) of the corresponding pixel;
2.3. Acquire the horizontal field-of-view angle of the camera, HOR_ANGLE;
2.4. Acquire the vertical field-of-view angle of the camera, VER_ANGLE;
2.5. Acquire the current horizontal offset angle of the camera, HOR_OFFSET_ANGLE;
2.6. Acquire the current vertical offset angle of the camera, VER_OFFSET_ANGLE;
2.7. Acquire the current zoom factor of the camera, SCALE_RATIO;
2.8. All pictures lie on a sphere and are displayed in the video by projection. Compute the three-dimensional coordinates of the current plane's center point, taking the focal length as 1 and defining the three-dimensional coordinate system with the y direction positive downward; PI is the approximation 3.1415926 used in the trigonometric functions;
2.9. Obtain the vertical angle corresponding to the current video center point: pointCenterY = sin(VER_OFFSET_ANGLE * PI / 180);
2.10. Compute the horizontal angle corresponding to the current video center point: pointCenterX = cos(HOR_OFFSET_ANGLE * PI / 180);
2.11. The Z vector intersects the X and Y axes perpendicularly, giving the coordinate of the calibration point in the Z direction: pointCenterZ = fabs(pointCenterX * tan(fPanPos * PI / 180));
2.12. Compute the projection width and height: ShadowSCREEN_W = 2 * tan((HOR_ANGLE / 2) * PI / 180.0), ShadowSCREEN_H = 2 * tan((VER_ANGLE / 2) * PI / 180.0);
2.13. Compute the coordinates of the random point relative to the center point: pointX = X - SCREEN_W / 2, pointY = Y - SCREEN_H / 2;
2.14. Compute the estimated coordinate values of the calibration point in the two-dimensional plane:
realX=(1+m_factor*(pow(pointX,2)+pow(pointY,2)))*pointX;
realY=(1+m_factor*(pow(pointX,2)+pow(pointY,2)))*pointY;
where m_factor is an adjustable error offset coefficient; the larger m_factor is, the smaller the offset;
2.15. Obtain the distance from the point to the center point from these estimated values;
2.16. scaleWidth = realX / SCREEN_W;
2.17. scaleHeight = realY / SCREEN_H;
2.18. Obtain the angles of the point relative to the center point:
angleW = scaleWidth * ShadowSCREEN_W;
angleH = scaleHeight * ShadowSCREEN_H;
2.19. Let the offsets of the calibration point relative to the center point coordinates be dx, dy, dz; then the following equations hold:
tan(fPanPos*PI/180)=dx/dz;
dx*dx+dz*dz=angleW*angleW;
cos(fTiltPos*PI/180)=dy/angleH;
2.20. Solving these equations simultaneously, with the signs determined by quadrant, yields dx, dy, and dz, and hence the estimated coordinates of the point in three-dimensional space;
2.21. When the camera moves, the above camera values are acquired again; according to the camera's angular offsets, the offsets dx, dy, dz and the camera origin define a straight line that intersects the spatial plane of the video picture, and back-deriving gives the calibration point M(X1, Y1, Z1) on the video picture, so the coordinates of the original point A on the picture can be calibrated again.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811326043.3A CN109584308B (en) | 2018-11-08 | 2018-11-08 | Position calibration method based on space live-action map |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109584308A CN109584308A (en) | 2019-04-05 |
CN109584308B (en) | 2023-04-28 |
Family
ID=65921909
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811326043.3A Active CN109584308B (en) | 2018-11-08 | 2018-11-08 | Position calibration method based on space live-action map |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109584308B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110308741B (en) * | 2019-07-16 | 2022-02-11 | 杭州叙简科技股份有限公司 | Multipoint unmanned aerial vehicle detection defense system and multipoint unmanned aerial vehicle detection striking method |
CN115375779B (en) * | 2022-10-27 | 2023-01-10 | 智广海联(天津)大数据技术有限公司 | Method and system for camera AR live-action annotation |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009048516A (en) * | 2007-08-22 | 2009-03-05 | Sony Corp | Information processor, information processing method and computer program |
CN105991929A (en) * | 2016-06-21 | 2016-10-05 | 浩云科技股份有限公司 | Extrinsic parameter calibration and whole-space video stitching method for whole-space camera |
CN107862703A (en) * | 2017-10-31 | 2018-03-30 | 天津天地伟业信息系统集成有限公司 | A multi-camera linkage PTZ tracking method |
Non-Patent Citations (1)
Title |
---|
Wang Ziheng et al., "Nonlinear calibration method for cameras" (摄像机非线性标定方法), Computer Engineering and Design (《计算机工程与设计》), No. 15, 2010-08-16, pp. 194-197. *
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |