CN110969663A - Static calibration method for external parameters of camera

Static calibration method for external parameters of camera

Info

Publication number
CN110969663A
Authority
CN
China
Prior art keywords
camera
image
point
coordinates
imu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811154021.3A
Other languages
Chinese (zh)
Other versions
CN110969663B
Inventor
陈亮
李晓东
芦超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Momenta Technology Co Ltd
Original Assignee
Beijing Chusudu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Chusudu Technology Co ltd filed Critical Beijing Chusudu Technology Co ltd
Priority to CN201811154021.3A (granted as CN110969663B)
Priority to PCT/CN2018/113667 (published as WO2020062434A1)
Publication of CN110969663A
Application granted
Publication of CN110969663B
Legal status: Active

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Navigation (AREA)

Abstract

A static calibration method for the external parameters of a camera. The method includes: step S1, setting a plurality of marker points; step S2, converting the actual GPS position of each marker point into an IMU coordinate system with the IMU as the coordinate origin to obtain a point P1; step S3, converting P1 into a camera coordinate system with the camera as the coordinate origin through a rotation-translation transformation matrix to obtain P2; step S4, converting P2 into the image coordinate system through the intrinsic parameter matrix of the camera device to obtain the projected coordinates of the point on the image; step S5, calculating a reprojection error function; step S6, optimizing the rotation-translation transformation matrix and repeating steps S3-S5 until the reprojection error falls below a specified threshold. The method constructs a loss function from the GPS coordinates of the marker points and their pixel coordinates in the camera device, and finally obtains the Euclidean transformation between the two, i.e., the rotation-translation transformation matrix. It requires no modification of the acquisition equipment or the vehicle, and completes the calibration of the camera's external parameters while the vehicle is stationary.

Description

Static calibration method for external parameters of camera
Technical Field
The invention relates to the field of intelligent driving, and in particular to a static calibration method for the external parameters of a camera.
Background
Currently, the accuracy of an autonomous-driving map is measured in the GPS coordinate system; that is, each point on the map must be expressed in GPS coordinates. In a multi-sensor setup, the camera and the GPS are often not mounted at the same location on the vehicle and may be separated by 2-3 meters. The external parameters of the camera therefore need to be calibrated to establish the spatial relationship between the camera and the GPS module. If the external parameters are not calibrated and the map is built directly from the camera images and the vehicle position, an error of two to three meters can result.
In the conventional calibration method, the camera and the IMU (which carries the GPS module) must be bound tightly together and then shaken vigorously, so that the motion excites all axes of the IMU and the camera and their trajectories can be aligned with each other, completing the calibration. However, a camera and IMU mounted on a vehicle cannot be shaken in this way, so the conventional calibration method does not apply.
In addition, the existing technical schemes lack research on how to adjust the transformation matrix by relating the reprojection error to the transformation matrix during coordinate transformation.
Disclosure of Invention
In view of the problems in the prior art, the invention adopts the following technical scheme:
a calibration method for external parameters of a camera is characterized by comprising the following steps:
step S1: setting one or more marker points, acquiring an image including the marker points through a camera device, and identifying the positions (u0, v0) of the marker points in the image;
step S2: converting the actual GPS position of each marker point into an IMU coordinate system with the IMU as the coordinate origin to obtain a point P1;
step S3: converting P1 into the camera coordinate system with the camera as the coordinate origin through a rotation-translation transformation matrix M[R|t] to obtain a point P2;
step S4: converting P2 into the image coordinate system through the intrinsic parameter matrix K of the camera device to obtain the projected coordinates (u1, v1) of the point on the image;
step S5: constructing a reprojection error function from (u0, v0) and (u1, v1) and calculating the reprojection error, which depends on the rotation-translation transformation matrix M[R|t];
step S6: optimizing the value of M[R|t] and repeating steps S3-S5 until the reprojection error falls below a specified threshold, at which point the value of M[R|t] is taken as the calibration result;
where, in the rotation-translation transformation matrix M[R|t], R denotes rotation and t denotes translation.
Preferably, in step S1, the camera devices are disposed on the roof of the vehicle or around it.
Preferably, the camera device comprises a plurality of cameras, and images acquired by the cameras have an overlapping area.
Preferably, in step S1, the position (u0, v0) of each marker point in the image is located row by row and column by column, either manually or by computer.
Preferably, in step S1, the positions (u0, v0) of the marker points in the image are obtained by a trained neural network.
Preferably, in step S5, the reprojection error function is as follows:

$$E(R,t)=\sum_{i}\left\|\,\pi\!\left(K\,[R\mid t]\begin{bmatrix}R_b^{-1}\!\left(P_i^{gps}-t_b\right)\\ 1\end{bmatrix}\right)-p_i^{img}\,\right\|^{2}$$

where:
K: the intrinsic parameter matrix of the camera;
$R_b, t_b$: the attitude and position of the IMU, respectively;
$P_i^{gps}$: the GPS coordinate of the i-th marker point;
$p_i^{img}$: the position of the i-th marker point in the image;
$\pi(\cdot)$: the perspective division from homogeneous to pixel coordinates.
Preferably, in step S6, the value of M [ R | t ] is optimized by the least squares method.
Preferably, the initial values of the rotation R and the translation t are both 0.
Preferably, in step S1, the number of marker points is 10-20.
Preferably, in step S2, the actual GPS position of each marker point is measured in advance or measured directly with an RTK device during calibration.
The invention is characterized by, but not limited to, the following aspects:
(1) A static calibration method for the external parameters of a camera is provided; the calibration can be completed while the vehicle is stationary, without modifying the acquisition equipment or the vehicle. The traditional method requires binding the camera and the IMU (which carries the GPS module) tightly together and then shaking them violently, which cannot be done when the sensors are mounted on a stationary vehicle. This is one of the key points of the invention.
(2) A new loss function is designed, through which the rotation-translation transformation matrix can be solved accurately and effectively. This is mainly reflected in the following aspects: the loss function takes into account the attitude and position of the IMU as well as the GPS coordinates of the marker points; the loss function is tied to the rotation-translation transformation matrix in the mathematical model, and by setting a specified threshold for the reprojection error, the value of M[R|t] is adjusted until it can serve as the calibration result. This is one of the key points of the invention.
Drawings
FIG. 1 is a step diagram of the camera extrinsic parameter static calibration method of the present invention.
The present invention is described in further detail below. The following examples are merely illustrative and neither represent nor limit the scope of protection of the invention, which is defined by the claims.
Detailed Description
The technical solution of the invention is further explained below through specific embodiments, in combination with the accompanying drawings.
To better illustrate the invention and to facilitate understanding of its technical solution, typical but non-limiting examples are as follows:
the three-dimensional map has good expression effect, has the characteristics of small data volume, high transmission speed, low cost and the like, and is widely applied to the fields of social management, public service, propaganda and display and the like through development for many years.
Generally, a three-dimensional map modeling worker completes three-dimensional modeling work by splicing images by taking a high-resolution satellite map or aerial photograph as a reference map according to data acquired by an acquisition worker; however, aerial photography or unmanned aerial vehicle shooting has many disadvantages, such as high cost and low resolution. The application provides a brand-new data acquisition mode of a three-dimensional map, and particularly relates to a method for acquiring data by installing a 360-degree omnibearing high-resolution camera device on a vehicle, wherein the camera device is installed on the roof of the vehicle, such as 4 high-resolution cameras around the vehicle, or the camera devices are respectively installed around the vehicle, the vehicle traverses each street crossing, simultaneously the camera device on the vehicle shoots high-resolution images as a reference map, and then the images are spliced to complete three-dimensional modeling work
The external parameters of a camera are its parameters in the world coordinate system, such as its position and orientation; they define the mapping from world coordinates to pixel coordinates. Calibration means solving for this mapping from the known world coordinates and pixel coordinates of calibration control points; once the mapping is solved, the world coordinates of a point can be recovered from its pixel coordinates, and vice versa.
The purpose of camera calibration is to determine the values of these parameters, with which a worker can map points in three-dimensional space to the image plane, or vice versa.
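For reference, this mapping is conventionally written in the pinhole model as follows (a standard textbook formulation, not quoted from the patent; s is the projective scale):

$$s\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=K\,[R\mid t]\begin{bmatrix}X\\ Y\\ Z\\ 1\end{bmatrix},\qquad K=\begin{bmatrix}f_x & 0 & c_x\\ 0 & f_y & c_y\\ 0 & 0 & 1\end{bmatrix}$$

Here K is the intrinsic parameter matrix and [R|t] the extrinsic (rotation-translation) matrix that this method calibrates.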
The invention is a static calibration method: during calibration the vehicle is stationary and the positions of the marker points do not change. A marker point may be a dedicated marker board or a marker rod. The GPS position of each marker is known; it may be measured in advance, acquired directly during calibration, or measured directly with an RTK device during calibration.
Generally, a camera has 6 external parameters: the rotation angles about the three axes, (ω, δ, θ), whose per-axis 3×3 rotation matrices are multiplied together to give the combined three-axis rotation R, itself a 3×3 matrix; and the translation t = (Tx, Ty, Tz).
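As an illustrative sketch (not the patent's code; the axis assignment ω→x, δ→y, θ→z and the multiplication order Rz·Ry·Rx are assumed conventions), composing the three per-axis rotations and assembling M[R|t] might look like this:

```python
import numpy as np

def rotation_matrix(omega, delta, theta):
    """Compose per-axis rotations (assumed: omega about x, delta about y,
    theta about z) into one 3x3 matrix R. The order Rz @ Ry @ Rx is one
    common convention; the patent does not specify it."""
    cx, sx = np.cos(omega), np.sin(omega)
    cy, sy = np.cos(delta), np.sin(delta)
    cz, sz = np.cos(theta), np.sin(theta)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

# The full extrinsic is then the 3x4 matrix M = [R | t] (values hypothetical):
R = rotation_matrix(0.01, -0.02, 0.03)
t = np.array([[1.5], [0.0], [0.8]])   # (Tx, Ty, Tz) in meters
M = np.hstack([R, t])
```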
Example 1
In this embodiment, several high-resolution video cameras or still cameras on the roof of the vehicle provide 360-degree high-definition video or photo capture. The calibration method for the external parameters of the cameras, with the vehicle stationary, includes the following steps:
step S1: one or more marker points are set, an image including the marker points is acquired by a camera on the roof of the vehicle, and the positions (u0, v0) of the marker points in the image are identified. These positions are observed values. A marker point may be a dedicated marker board or a marker rod; because a marker board occupies a small area and its GPS position is known, the acquired signal is highly accurate. The GPS position of each marker is known; it may be measured in advance, acquired directly during calibration, or measured directly with an RTK device during calibration.
Because the camera has 6 external parameters, at least 6 marker points are needed to solve for the parameters from the corresponding equations. In practice, about 10 to 20 marker points are selected to reduce the influence of data noise on the solution; in theory more marker points help the solution, but they also increase its cost. The coordinates of the marker points in the camera image can be located row by row and column by column, either manually or by computer.
Step S2: the actual GPS position of each marker point is converted into an IMU coordinate system with the IMU as the coordinate origin, giving point P1. The longitude-latitude system of GPS is ellipsoidal, not orthogonal, whereas the rotation-translation transformation matrix M[R|t] presupposes an orthogonal coordinate system. Converting GPS points into IMU coordinates therefore means converting longitude-latitude coordinates into Cartesian (i.e., orthogonal) coordinates; this conversion is a standard method in GIS systems. After the transformation, the point is a three-dimensional (X, Y, Z) coordinate representing the location of the marker point relative to the IMU. In the rotation-translation transformation matrix M[R|t], R denotes the rotation and t the translation.
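A minimal sketch of this standard GIS conversion, assuming the WGS-84 ellipsoid and a local East-North-Up (ENU) frame centered on the IMU (the coordinate values below are hypothetical, and a production system would further rotate ENU into the IMU body frame using the IMU attitude R_b):

```python
import numpy as np

# WGS-84 ellipsoid constants
A = 6378137.0             # semi-major axis (m)
E2 = 6.69437999014e-3     # first eccentricity squared

def geodetic_to_ecef(lat, lon, h):
    """Convert geodetic coordinates (radians, meters) to ECEF (meters)."""
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)   # prime vertical radius
    x = (n + h) * np.cos(lat) * np.cos(lon)
    y = (n + h) * np.cos(lat) * np.sin(lon)
    z = (n * (1.0 - E2) + h) * np.sin(lat)
    return np.array([x, y, z])

def ecef_to_enu(p_ecef, ref_lat, ref_lon, ref_ecef):
    """Express an ECEF point in a local East-North-Up frame at the reference."""
    sl, cl = np.sin(ref_lat), np.cos(ref_lat)
    so, co = np.sin(ref_lon), np.cos(ref_lon)
    R = np.array([[-so,       co,      0.0],
                  [-sl * co, -sl * so, cl ],
                  [ cl * co,  cl * so, sl ]])
    return R @ (p_ecef - ref_ecef)

# Hypothetical GPS fixes: (latitude, longitude) in degrees, height in meters
marker_lat, marker_lon, marker_h = np.radians(39.9900), np.radians(116.3100), 45.0
imu_lat, imu_lon, imu_h = np.radians(39.9899), np.radians(116.3099), 44.5

marker_ecef = geodetic_to_ecef(marker_lat, marker_lon, marker_h)
imu_ecef = geodetic_to_ecef(imu_lat, imu_lon, imu_h)
P1 = ecef_to_enu(marker_ecef, imu_lat, imu_lon, imu_ecef)  # marker relative to IMU
```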
Step S3: point P2 is obtained by transforming P1 into the camera coordinate system, with the camera as the coordinate origin, through the rotation-translation transformation matrix M[R|t].
Step S4: P2 is converted into the image coordinate system through the projection matrix of the camera, giving the projected coordinates (u1, v1) of the point on the image. These projected coordinates are estimated values. The projection matrix is the camera's intrinsic parameter matrix; this step converts the three-dimensional point P2 into the two-dimensional coordinates (u1, v1) the point should theoretically have on the camera image.
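Steps S3-S4 amount to a pinhole projection; a sketch under assumed (hypothetical) intrinsics, with helper names of my choosing:

```python
import numpy as np

def project(P1, R, t, K):
    """Apply M[R|t] to an IMU-frame point, then the intrinsics K,
    returning the estimated pixel coordinates (u1, v1)."""
    P2 = R @ P1 + t            # step S3: IMU frame -> camera frame
    uvw = K @ P2               # step S4: camera frame -> homogeneous pixels
    return uvw[:2] / uvw[2]    # perspective division

# Hypothetical intrinsics for a high-resolution camera
K = np.array([[2000.0,    0.0, 1920.0],
              [   0.0, 2000.0, 1080.0],
              [   0.0,    0.0,    1.0]])
u1, v1 = project(np.array([3.0, 0.5, 12.0]), np.eye(3), np.zeros(3), K)
```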
Step S5: a reprojection error function of (u0, v0) and (u1, v1) is constructed and the reprojection error is calculated; the error depends on M[R|t]. The reprojection error function is as follows:

$$E(R,t)=\sum_{i}\left\|\,\pi\!\left(K\,[R\mid t]\begin{bmatrix}R_b^{-1}\!\left(P_i^{gps}-t_b\right)\\ 1\end{bmatrix}\right)-p_i^{img}\,\right\|^{2}$$

where:
K: the intrinsic parameter matrix of the camera;
$R_b, t_b$: the attitude and position of the IMU, respectively;
$P_i^{gps}$: the GPS coordinate of the i-th marker point;
$p_i^{img}$: the position of the i-th marker point in the image; this position is also an observation and may be identified manually.
The projected term inside the norm gives the estimated coordinates (u1, v1) of the i-th marker point.
Step S6: the value of M[R|t] is optimized by the least squares method, steps S3-S5 are repeated, and the value of M[R|t] is iterated until the reprojection error falls below a specified threshold; the resulting value of M[R|t] is taken as the calibration result. Optimizing the value of M[R|t] by least squares is prior art, and there are various ways to solve for it; for example, the update to M[R|t] can be determined from the derivative of the reprojection error function and its magnitude. For simplicity the initial value of M[R|t] may be 0, the true values being on the order of 10 to the minus sixth power.
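As one concrete realization of this step (a sketch under assumptions, not the patent's exact procedure; the helper names are illustrative), R can be parameterized as a rotation vector, the per-point reprojection residuals stacked, and the whole handed to SciPy's least-squares solver, starting from the zero initial value mentioned above:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(x, K, pts_imu, pts_img):
    """Stack the 2D reprojection residuals for all marker points.

    x packs the extrinsics: x[:3] is a rotation vector (axis-angle form of R),
    x[3:] is the translation t. pts_imu are the IMU-frame marker points P1;
    pts_img are the observed pixel positions (u0, v0)."""
    R = Rotation.from_rotvec(x[:3]).as_matrix()
    t = x[3:]
    res = []
    for P1, uv0 in zip(pts_imu, pts_img):
        uvw = K @ (R @ P1 + t)          # steps S3-S4
        res.append(uvw[:2] / uvw[2] - uv0)
    return np.concatenate(res)

def calibrate(K, pts_imu, pts_img):
    """Iterate M[R|t] from a zero initial value, as in step S6."""
    x0 = np.zeros(6)                    # rotation and translation start at 0
    sol = least_squares(residuals, x0, args=(K, pts_imu, pts_img))
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```

The solver stops when the residuals converge; checking the final cost against the specified threshold then decides whether the returned M[R|t] is accepted as the calibration result.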
Example 2
In this embodiment, several high-resolution video cameras or still cameras may be arranged around the vehicle, as long as 360-degree high-definition video or photo capture is achieved. The calibration method for the external parameters of the cameras includes the following steps:
step S1: one or more marker points are set, an image including the marker points is acquired by the cameras around the vehicle, and the positions (u0, v0) of the marker points in the image are identified. The position coordinates of the marker points in the image can be computed by a trained neural network, the network having been trained on a large amount of existing data (i.e., images with known corresponding marker-point coordinates).
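The patent does not specify the network; as a purely hypothetical sketch of such a detector, a small CNN can regress a marker's pixel coordinates from an image crop and be trained with a mean-squared-error loss on the known (image, coordinate) pairs:

```python
import torch
import torch.nn as nn

class MarkerRegressor(nn.Module):
    """Toy CNN that regresses a single marker's (u, v) from an image crop.

    Purely illustrative of the 'trained neural network' mentioned above;
    the architecture and sizes are assumptions, not the patent's design."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)    # outputs (u0, v0)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = MarkerRegressor()
loss_fn = nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on hypothetical data: crops and their known labels
images = torch.randn(8, 3, 64, 64)
targets = torch.rand(8, 2) * 64
loss = loss_fn(model(images), targets)
opt.zero_grad()
loss.backward()
opt.step()
```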
Step S2: the actual GPS position of the marked point is converted to an IMU coordinate system with the IMU as the origin of coordinates, resulting in point P1. Here, the GPS position and IMU coordinates are both three-dimensional coordinates, and the resulting point P1 is also a three-dimensional coordinate.
Step S3: P1 is converted into the camera coordinate system, with the camera as the coordinate origin, through the rotation-translation transformation matrix M[R|t], giving point P2. Point P2 is likewise a three-dimensional coordinate.
Step S4: P2 is converted into the image coordinate system through the projection matrix of the camera, giving the projected coordinates (u1, v1) of the point on the image. The projection matrix is the camera's intrinsic parameter matrix; this step converts the three-dimensional point P2 into the two-dimensional coordinates (u1, v1) the point should theoretically have on the camera image.
Step S5: a reprojection error function of (u0, v0) and (u1, v1) is constructed and the reprojection error is calculated; the error depends on M[R|t]. The reprojection error function is as follows:

$$E(R,t)=\sum_{i}\left\|\,\pi\!\left(K\,[R\mid t]\begin{bmatrix}R_b^{-1}\!\left(P_i^{gps}-t_b\right)\\ 1\end{bmatrix}\right)-p_i^{img}\,\right\|^{2}$$

where:
K: the intrinsic parameter matrix of the camera;
$R_b, t_b$: the attitude and position of the IMU, respectively;
$P_i^{gps}$: the GPS coordinate of the i-th marker point;
$p_i^{img}$: the position of the i-th marker point in the image; this position is also an observation and may be identified manually.
The projected term inside the norm gives the estimated coordinates (u1, v1) of the i-th marker point.
Step S6: the value of M[R|t] is optimized by the least squares method, and steps S3-S5 are repeated until the reprojection error falls below a specified threshold; the resulting value of M[R|t] is taken as the calibration result.
The applicant declares that the detailed features of the invention are illustrated by the above embodiments, but the invention is not limited to them; that is, the invention need not rely on the above detailed features to be implemented. Those skilled in the art should understand that any modification of the invention, equivalent substitution of selected components, addition of auxiliary components, selection of specific modes, and the like fall within the scope and disclosure of the invention.
The preferred embodiments of the present invention have been described in detail, however, the present invention is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present invention within the technical idea of the present invention, and these simple modifications are within the protective scope of the present invention.
It should be noted that the technical features described in the above embodiments may be combined in any suitable manner without contradiction; to avoid unnecessary repetition, the possible combinations are not described separately.
In addition, any combination of the various embodiments of the present invention is also possible, and the same should be considered as the disclosure of the present invention as long as it does not depart from the spirit of the present invention.

Claims (10)

1. A calibration method for external parameters of a camera is characterized by comprising the following steps:
step S1: setting one or more marker points, acquiring an image including the marker points through a camera device, and identifying the positions (u0, v0) of the marker points in the image;
step S2: converting the actual GPS position of each marker point into an IMU coordinate system with the IMU as the coordinate origin to obtain a point P1;
step S3: converting P1 into the camera coordinate system with the camera as the coordinate origin through a rotation-translation transformation matrix M[R|t] to obtain a point P2;
step S4: converting P2 into the image coordinate system through the intrinsic parameter matrix K of the camera device to obtain the projected coordinates (u1, v1) of the point on the image;
step S5: constructing a reprojection error function from (u0, v0) and (u1, v1) and calculating the reprojection error, which depends on the rotation-translation transformation matrix M[R|t];
step S6: optimizing the value of M[R|t] and repeating steps S3-S5 until the reprojection error falls below a specified threshold, at which point the value of M[R|t] is taken as the calibration result;
where, in the rotation-translation transformation matrix M[R|t], R denotes rotation and t denotes translation.
2. The method of claim 1, wherein: in step S1, the camera devices are provided on the roof of the vehicle or around it.
3. The method according to any one of claims 1-2, characterized in that: the camera device comprises a plurality of cameras, and the images acquired by the cameras have an overlapping area.
4. The method according to any one of claims 1-3, characterized in that: in step S1, the position (u0, v0) of each marker point in the image is located row by row and column by column, either manually or by computer.
5. The method according to any one of claims 1-4, characterized in that: in step S1, the positions (u0, v0) of the marker points in the image are obtained through a trained neural network.
6. The method of claim 1, wherein: in step S5, the reprojection error function is as follows:

$$E(R,t)=\sum_{i}\left\|\,\pi\!\left(K\,[R\mid t]\begin{bmatrix}R_b^{-1}\!\left(P_i^{gps}-t_b\right)\\ 1\end{bmatrix}\right)-p_i^{img}\,\right\|^{2}$$

where:
K: the intrinsic parameter matrix of the camera;
$R_b, t_b$: the attitude and position of the IMU, respectively;
$P_i^{gps}$: the GPS coordinate of the i-th marker point;
$p_i^{img}$: the position of the i-th marker point in the image.
7. The method according to any one of claims 1-6, characterized in that: in step S6, the value of M[R|t] is optimized by the least squares method.
8. The method according to any one of claims 1-7, characterized in that: the initial values of the rotation R and the translation t are both 0.
9. The method according to any one of claims 1-8, characterized in that: in step S1, the number of marker points is 10-20.
10. The method according to any one of claims 1-9, characterized in that: in step S2, the actual GPS position of each marker point is measured in advance or measured directly with an RTK device during calibration.
CN201811154021.3A 2018-09-30 2018-09-30 Static calibration method for external parameters of camera Active CN110969663B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811154021.3A CN110969663B (en) 2018-09-30 2018-09-30 Static calibration method for external parameters of camera
PCT/CN2018/113667 WO2020062434A1 (en) 2018-09-30 2018-11-02 Static calibration method for external parameters of camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811154021.3A CN110969663B (en) 2018-09-30 2018-09-30 Static calibration method for external parameters of camera

Publications (2)

Publication Number Publication Date
CN110969663A 2020-04-07
CN110969663B 2023-10-03

Family

ID=69952814

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811154021.3A Active CN110969663B (en) 2018-09-30 2018-09-30 Static calibration method for external parameters of camera

Country Status (2)

Country Link
CN (1) CN110969663B (en)
WO (1) WO2020062434A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112419420A (en) * 2020-09-17 2021-02-26 腾讯科技(深圳)有限公司 Camera calibration method and device, electronic equipment and storage medium
CN112611361A (en) * 2020-12-08 2021-04-06 华南理工大学 Method for measuring installation error of camera of airborne surveying and mapping pod of unmanned aerial vehicle
CN112802123A (en) * 2021-01-21 2021-05-14 北京科技大学设计研究院有限公司 Binocular linear array camera static calibration method based on stripe virtual target
CN112837381A (en) * 2021-02-09 2021-05-25 上海振华重工(集团)股份有限公司 Camera calibration method, system and equipment suitable for driving equipment

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112037285B (en) * 2020-08-24 2024-03-29 上海电力大学 Camera calibration method based on Levy flight and variation mechanism gray wolf optimization
CN112116529A (en) * 2020-09-23 2020-12-22 浙江浩腾电子科技股份有限公司 PTZ camera-based conversion method for GPS coordinates and pixel coordinates
CN112489111B (en) * 2020-11-25 2024-01-30 深圳地平线机器人科技有限公司 Camera external parameter calibration method and device and camera external parameter calibration system
CN112465920A (en) * 2020-12-08 2021-03-09 广州小鹏自动驾驶科技有限公司 Vision sensor calibration method and device
CN112767498A (en) * 2021-02-03 2021-05-07 苏州挚途科技有限公司 Camera calibration method and device and electronic equipment
CN113284193B (en) * 2021-06-22 2024-02-02 智道网联科技(北京)有限公司 Calibration method, device and equipment of RS equipment

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010045766A (en) * 2008-07-16 2010-02-25 Kodaira Associates Kk System for calibrating urban landscape image information
CN103985118A (en) * 2014-04-28 2014-08-13 无锡观智视觉科技有限公司 Parameter calibration method for cameras of vehicle-mounted all-round view system
CN104766058A (en) * 2015-03-31 2015-07-08 百度在线网络技术(北京)有限公司 Method and device for obtaining lane line
CN105930819A (en) * 2016-05-06 2016-09-07 西安交通大学 System for real-time identifying urban traffic lights based on single eye vision and GPS integrated navigation system
US20160320476A1 (en) * 2015-04-28 2016-11-03 Henri Johnson Systems to track a moving sports object
CN106803273A (en) * 2017-01-17 2017-06-06 湖南优象科技有限公司 A kind of panoramic camera scaling method
US20170227361A1 (en) * 2014-06-20 2017-08-10 Uti Limited Partnership Mobile mapping system
CN107067437A (en) * 2016-12-28 2017-08-18 中国航天电子技术研究院 A kind of unmanned plane alignment system and method based on multiple view geometry and bundle adjustment
US20170243399A1 (en) * 2016-02-19 2017-08-24 The Boeing Company Methods for Localization Using Geotagged Photographs and Three-Dimensional Visualization
CN108230381A (en) * 2018-01-17 2018-06-29 华中科技大学 A kind of combination spatial and the multiple view Stereo Vision of Pixel-level optimization
CN108279428A (en) * 2017-01-05 2018-07-13 武汉四维图新科技有限公司 Map datum evaluating apparatus and system, data collecting system and collecting vehicle and acquisition base station
CN108288294A (en) * 2018-01-17 2018-07-17 视缘(上海)智能科技有限公司 A kind of outer ginseng scaling method of a 3D phases group of planes
CN108364252A (en) * 2018-01-12 2018-08-03 深圳市粒视界科技有限公司 A kind of correction of more fish eye lens panorama cameras and scaling method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8310539B2 (en) * 2009-05-29 2012-11-13 Mori Seiki Co., Ltd Calibration method and calibration device
CN103593836A (en) * 2012-08-14 2014-02-19 无锡维森智能传感技术有限公司 A Camera parameter calculating method and a method for determining vehicle body posture with cameras
CN103049912B (en) * 2012-12-21 2015-03-11 浙江大学 Random trihedron-based radar-camera system external parameter calibration method
CN103745452B (en) * 2013-11-26 2014-11-26 理光软件研究所(北京)有限公司 Camera external parameter assessment method and device, and camera external parameter calibration method and device
CN105547228B (en) * 2015-12-09 2018-10-09 中国航空工业集团公司西安飞机设计研究所 A kind of angle displacement measuring device

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010045766A (en) * 2008-07-16 2010-02-25 Kodaira Associates Kk System for calibrating urban landscape image information
CN103985118A (en) * 2014-04-28 2014-08-13 无锡观智视觉科技有限公司 Parameter calibration method for cameras of vehicle-mounted all-round view system
US20170227361A1 (en) * 2014-06-20 2017-08-10 Uti Limited Partnership Mobile mapping system
CN104766058A (en) * 2015-03-31 2015-07-08 百度在线网络技术(北京)有限公司 Method and device for obtaining lane line
US20160320476A1 (en) * 2015-04-28 2016-11-03 Henri Johnson Systems to track a moving sports object
US20170243399A1 (en) * 2016-02-19 2017-08-24 The Boeing Company Methods for Localization Using Geotagged Photographs and Three-Dimensional Visualization
CN105930819A (en) * 2016-05-06 2016-09-07 西安交通大学 System for real-time identifying urban traffic lights based on single eye vision and GPS integrated navigation system
CN107067437A (en) * 2016-12-28 2017-08-18 中国航天电子技术研究院 A kind of unmanned plane alignment system and method based on multiple view geometry and bundle adjustment
CN108279428A (en) * 2017-01-05 2018-07-13 武汉四维图新科技有限公司 Map datum evaluating apparatus and system, data collecting system and collecting vehicle and acquisition base station
CN106803273A (en) * 2017-01-17 2017-06-06 湖南优象科技有限公司 A kind of panoramic camera scaling method
CN108364252A (en) * 2018-01-12 2018-08-03 深圳市粒视界科技有限公司 A kind of correction of more fish eye lens panorama cameras and scaling method
CN108230381A (en) * 2018-01-17 2018-06-29 华中科技大学 A kind of combination spatial and the multiple view Stereo Vision of Pixel-level optimization
CN108288294A (en) * 2018-01-17 2018-07-17 视缘(上海)智能科技有限公司 A kind of outer ginseng scaling method of a 3D phases group of planes

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
He Guohui; Huang Wenjun: "Multi-camera external parameter calibration based on a circular-dot array", no. 04 *
Zhang Qin; Jia Qingxuan; Xi Ning; Jiang Yong: "Calibration method for a laser-camera system based on two-viewpoint feature matching", Chinese Journal of Scientific Instrument, no. 11, pages 2623-2627 *
Chen Mingwei; Zhu Dengming; Mao Tianlu; Wang Zhaoqi: "Trajectory tracking of motion-control cameras based on second-order cone programming", no. 08 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112419420A (en) * 2020-09-17 2021-02-26 腾讯科技(深圳)有限公司 Camera calibration method and device, electronic equipment and storage medium
CN112611361A (en) * 2020-12-08 2021-04-06 华南理工大学 Method for measuring installation error of camera of airborne surveying and mapping pod of unmanned aerial vehicle
CN112802123A (en) * 2021-01-21 2021-05-14 北京科技大学设计研究院有限公司 Binocular linear array camera static calibration method based on stripe virtual target
CN112802123B (en) * 2021-01-21 2023-10-27 北京科技大学设计研究院有限公司 Binocular linear array camera static calibration method based on stripe virtual target
CN112837381A (en) * 2021-02-09 2021-05-25 上海振华重工(集团)股份有限公司 Camera calibration method, system and equipment suitable for driving equipment

Also Published As

Publication number Publication date
CN110969663B (en) 2023-10-03
WO2020062434A1 (en) 2020-04-02

Similar Documents

Publication Publication Date Title
CN110969663B (en) Static calibration method for external parameters of camera
CN112894832B (en) Three-dimensional modeling method, three-dimensional modeling device, electronic equipment and storage medium
CN109523471B (en) Method, system and device for converting ground coordinates and wide-angle camera picture coordinates
CN111121754A (en) Mobile robot positioning navigation method and device, mobile robot and storage medium
CN104268935A (en) Feature-based airborne laser point cloud and image data fusion system and method
CN109146958B (en) Traffic sign space position measuring method based on two-dimensional image
CN111260539B (en) Fish eye pattern target identification method and system thereof
CN104240262A (en) Calibration device and calibration method for outer parameters of camera for photogrammetry
CN103310487B (en) A kind of universal imaging geometric model based on time variable generates method
CN115588040A (en) System and method for counting and positioning coordinates based on full-view imaging points
CN115439531A (en) Method and equipment for acquiring target space position information of target object
CN110736472A (en) indoor high-precision map representation method based on fusion of vehicle-mounted all-around images and millimeter wave radar
CN112902929A (en) Novel surveying and mapping method through unmanned aerial vehicle aerial survey
CN115439528B (en) Method and equipment for acquiring image position information of target object
CN115690612A (en) Unmanned aerial vehicle photoelectric image target search quantization indicating method, device and medium
CN114782548A (en) Global image-based radar vision data calibration method, device, equipment and medium
CN113415433B (en) Pod attitude correction method and device based on three-dimensional scene model and unmanned aerial vehicle
CN114979956A (en) Unmanned aerial vehicle aerial photography ground target positioning method and system
CN114659523A (en) Large-range high-precision attitude measurement method and device
CN114529585A (en) Mobile equipment autonomous positioning method based on depth vision and inertial measurement
CN112489118B (en) Method for quickly calibrating external parameters of airborne sensor group of unmanned aerial vehicle
CN114429515A (en) Point cloud map construction method, device and equipment
CN115112100A (en) Remote sensing control system and method
CN114170323A (en) Parameter calibration method and equipment for image shooting device and storage medium
CN110969664A (en) Dynamic calibration method for external parameters of camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220301

Address after: 100083 unit 501, block AB, Dongsheng building, No. 8, Zhongguancun East Road, Haidian District, Beijing

Applicant after: BEIJING MOMENTA TECHNOLOGY Co.,Ltd.

Address before: Room 28, 4 / F, block a, Dongsheng building, No. 8, Zhongguancun East Road, Haidian District, Beijing 100089

Applicant before: BEIJING CHUSUDU TECHNOLOGY Co.,Ltd.

GR01 Patent grant