WO2020062434A1 - Static calibration method for external parameters of camera - Google Patents
- Publication number
- WO2020062434A1 (PCT/CN2018/113667)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- image
- point
- coordinates
- imu
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Definitions
- the invention relates to the field of intelligent driving, in particular to a static calibration method for external parameters of a camera.
- the accuracy of autonomous driving maps is measured in the GPS coordinate system, which means that each point on the map needs to be represented by GPS coordinates.
- the camera and the GPS are generally not installed at the same location on the vehicle, and may be separated by a distance of 2 to 3 meters. Therefore, the external parameters of the camera need to be calibrated to establish the spatial position relationship between the camera and the GPS module. If the external parameters are not calibrated and the map is built directly from the camera image and the position of the vehicle body, an error of two or three meters may eventually result.
- in the traditional method, the camera and the IMU (with the GPS module mounted on the IMU) must be tied closely together and then shaken violently to excite all axes of the IMU and the camera, so that the trajectory of the IMU can be fitted to the trajectory of the camera to complete the calibration.
- the camera and IMU installed on the car cannot shake, so the traditional calibration method is not applicable.
- a method for calibrating external parameters of a camera includes the following steps:
- Step S1 Set one or more marked points, acquire an image including the marked points through a camera device, and identify the position (u0, v0) of the marked points in the image;
- Step S2 Convert the actual GPS position of the marked point into the IMU coordinate system with the IMU as the coordinate origin to obtain the point P1;
- Step S3 Transform P1 into a coordinate system with the camera device as the coordinate origin by the rotation-translation transformation matrix M[R|t] to obtain the point P2;
- Step S4 P2 is transformed into the coordinate system of the image by the internal parameter matrix K of the camera device itself to obtain the projection coordinates (u1, v1) of the point on the image;
- Step S5 Construct the reprojection error function of (u0, v0) and (u1, v1), and calculate the reprojection error, which is related to the rotation-translation transformation matrix M[R|t]; in the function:
- K: the internal parameter matrix of the camera
- R_b, t_b: the attitude and position of the IMU, respectively
- Step S6 Optimize the value of M[R|t] and repeat steps S3-S5 until the reprojection error is lower than the specified threshold; the value of M[R|t] at that point is taken as the calibration result.
- the camera device is disposed on the roof of the vehicle or around the vehicle.
- the camera includes a plurality of cameras, and there is an overlapping area between the images acquired by each camera.
- step S1 the position (u0, v0) of the marked point in the image is obtained by searching by rows or columns manually or by a computer.
- step S1 the position (u0, v0) of the marked point in the image is obtained through a trained neural network.
- step S5 the re-projection error function is as follows:
- K: the internal parameter matrix of the camera
- R_b, t_b: the attitude and position of the IMU, respectively
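The formula referenced in step S5 appears to have been an image that did not survive extraction. Assuming the standard pinhole reprojection model and the symbols defined above, one plausible form of the error function (a reconstruction, not the verbatim published formula) is:

```latex
e(R, t) = \sum_{i=1}^{N} \left\| \begin{pmatrix} u_0^{\,i} \\ v_0^{\,i} \end{pmatrix}
- \pi\!\left( K \, M[R \mid t] \, P_1^{\,i} \right) \right\|^{2}
```

where \(P_1^{\,i}\) is the i-th marked point expressed in IMU coordinates (derived from its GPS position and the IMU attitude and position \(R_b, t_b\)), and \(\pi\) denotes the perspective division that maps homogeneous camera coordinates to pixel coordinates.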
- in step S6, the value of M[R|t] is optimized by the method of least squares.
- the initial values of the rotation R and the translation t both take 0.
- the number of the marked points is 10-20.
- step S2 the actual GPS position of the marked point is measured in advance, or directly measured by using an RTK device during calibration.
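The row-and-column search for the marked point mentioned in step S1 can be sketched as follows. This is a minimal illustration, assuming a grayscale image stored as a 2D list and a simple brightness threshold; a real marker detector would be more robust.

```python
# Minimal sketch of the row/column search of step S1: scan the image for
# pixels matching the marker (here: brighter than a threshold, an
# assumption for illustration) and return the centroid as (u0, v0).

def find_marker(image, threshold=200):
    """Scan row by row, column by column; return the centroid (u0, v0)
    of pixels brighter than `threshold`, or None if no marker is found."""
    hits = []
    for v, row in enumerate(image):      # v: row index (image y)
        for u, value in enumerate(row):  # u: column index (image x)
            if value >= threshold:
                hits.append((u, v))
    if not hits:
        return None
    u0 = sum(u for u, _ in hits) / len(hits)
    v0 = sum(v for _, v in hits) / len(hits)
    return (u0, v0)

# A 5x5 test image with a single bright marker pixel at column 3, row 2.
img = [[0] * 5 for _ in range(5)]
img[2][3] = 255
print(find_marker(img))  # -> (3.0, 2.0)
```

In practice the same scan could be done manually by inspecting the image, as the text notes; the computer version simply automates it.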
- the inventive points of the present invention include, but are not limited to, the following aspects:
- (1) a static calibration method for the external parameters of the camera is provided; the calibration can be completed while the vehicle is stationary, without modifying the acquisition equipment or the vehicle;
- the traditional calibration method requires the camera and the IMU (with the GPS module on the IMU) to be tied together very tightly and then shaken violently.
- when the vehicle is stationary, the traditional method cannot be used.
- the present invention constructs a loss function from the GPS coordinates of the marked points and their pixel coordinates in the camera image, and solves for the Euclidean transformation between the two, i.e., the rotation-translation transformation matrix. This method requires no modification of the acquisition equipment or the vehicle, and the calibration of the camera's external parameters can be completed in the static state. This is one of the inventive points of the present invention.
- the loss function takes into account the attitude and position of the IMU as well as the GPS coordinates of the marked points; in the mathematical model the loss function is associated with the rotation-translation transformation matrix, and the value of M[R|t] is adjusted until the reprojection error falls below a specified threshold.
- FIG. 1 is a step diagram of a static calibration method for external parameters of a camera according to the present invention.
- Three-dimensional maps render well and have the advantages of small data volume, fast transmission, and low cost. After years of development, three-dimensional maps have been widely used in social management, public services, and publicity and display.
- 3D map modelers use high-resolution satellite maps or aerial photographs as reference maps, and complete the 3D modeling work by stitching the images collected in the field; however, aerial or drone photography has drawbacks such as high cost and limited resolution.
- this application provides a new data acquisition method for three-dimensional maps: a set of high-resolution cameras giving a 360-degree all-around view is installed on the vehicle, for example four high-resolution cameras on the front, back, left and right. The vehicle traverses each street and intersection while the cameras capture high-resolution images as the reference map, and stitching these images completes the 3D modeling work.
- Camera external parameters are parameters in the world coordinate system, such as camera position, rotation direction, etc.
- camera external parameter calibration determines the mapping from world coordinates to pixel coordinates.
- the world coordinates are taken as known, and calibration is performed against known control points.
- given the world coordinates and pixel coordinates of a set of points, we solve for this mapping relationship; once it is solved, we can infer a point's world coordinates from its pixel coordinates.
- the purpose of camera calibration is to determine the values of some camera parameters.
- the staff can use these parameters to map points in a three-dimensional space to the image space, or vice versa.
- the invention is a static calibration method.
- the marking point can be a special marking plate or a marking rod.
- the GPS position of the marker point is known. This position can be measured in advance and obtained directly during calibration, or it can be measured directly using RTK equipment during calibration.
- the rotation about each of the three axes is described by one rotation parameter; combining the 3*3 rotation matrix of each axis (that is, multiplying the matrices together) yields the overall three-axis rotation matrix R, still of size 3*3; t collects the translation parameters (Tx, Ty, Tz) of the three axes.
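The composition of per-axis rotations described above can be sketched as follows. The Z·Y·X multiplication order is an assumption for illustration, since the text does not fix a convention:

```python
# Build one 3x3 rotation matrix per axis and multiply them into a single
# three-axis rotation R (order Z * Y * X, an assumed convention).
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rotation(ax, ay, az):
    """Combine the three per-axis rotations into one 3x3 matrix R."""
    return matmul(rot_z(az), matmul(rot_y(ay), rot_x(ax)))

R = rotation(0.0, 0.0, math.pi / 2)  # 90 degrees about the z axis
# Rotating the x unit vector (1, 0, 0) by R yields approximately (0, 1, 0).
```

Together with the translation vector t = (Tx, Ty, Tz), this R forms the M[R|t] used throughout the method.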
- a plurality of high-resolution cameras on the top of the vehicle can realize 360-degree high-definition video recording or photography.
- the method of calibrating the external parameters of the camera includes the following steps:
- Step S1 Set one or more marked points, and obtain the image including the marked points by the camera on the top of the vehicle, and identify the position (u0, v0) of the marked points in the image.
- the position of the marker point in the image is the observation value.
- the marker point here can be a special marker card or a marker rod; because it occupies a small area and its GPS position is known, the accuracy of the acquired signal is higher. The GPS position of the marker point is known.
- the position can be measured in advance and obtained directly during calibration, or it can be measured directly using RTK equipment during calibration.
- Step S2 The actual GPS position of the marked point is converted to the IMU coordinate system with the IMU as the origin of the coordinates to obtain the point P1.
- the longitude-latitude-height system of GPS is an ellipsoidal system, not an orthogonal one, whereas the rotation-translation transformation matrix M[R|t] operates on orthogonal coordinates, so the GPS position must first be converted into the IMU coordinate system.
- the coordinates of this point are three-dimensional coordinates (X, Y, Z), indicating the position of the marked point relative to the IMU.
- in M[R|t], R represents rotation and t represents translation.
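A hedged sketch of step S2's coordinate conversion: the text only states that the ellipsoidal GPS position must be converted into the orthogonal IMU frame, so the standard WGS84 geodetic-to-ECEF conversion followed by an ECEF-to-ENU (East-North-Up) rotation is used here as one reasonable choice. Aligning the ENU axes with the IMU body axes would additionally require the IMU attitude R_b, which is omitted in this sketch.

```python
# Convert a marked point's GPS position (lat, lon, height) into an
# orthogonal local frame centred on the IMU: geodetic -> ECEF -> ENU.
import math

WGS84_A = 6378137.0           # WGS84 semi-major axis (m)
WGS84_E2 = 6.69437999014e-3   # WGS84 first eccentricity squared

def geodetic_to_ecef(lat, lon, h):
    lat, lon = math.radians(lat), math.radians(lon)
    n = WGS84_A / math.sqrt(1 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - WGS84_E2) + h) * math.sin(lat)
    return (x, y, z)

def ecef_to_enu(point, origin, origin_lat, origin_lon):
    """East-North-Up coordinates of `point` relative to `origin` (both ECEF)."""
    lat, lon = math.radians(origin_lat), math.radians(origin_lon)
    dx, dy, dz = (p - o for p, o in zip(point, origin))
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy + math.cos(lat) * dz)
    up = (math.cos(lat) * math.cos(lon) * dx
          + math.cos(lat) * math.sin(lon) * dy + math.sin(lat) * dz)
    return (east, north, up)

# Illustrative values: IMU at (31.0 N, 121.0 E, 10 m); marker 0.0001 deg north.
imu = geodetic_to_ecef(31.0, 121.0, 10.0)
marker = geodetic_to_ecef(31.0001, 121.0, 10.0)
p1 = ecef_to_enu(marker, imu, 31.0, 121.0)  # roughly (0, 11.1, 0) metres
```

The resulting P1 = (X, Y, Z) is the marked point's position relative to the IMU, as described above.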
- Step S3 P1 is transformed into the camera coordinate system with the camera as the coordinate origin by the rotation-translation transformation matrix M[R|t] to obtain the point P2.
- Step S4 P2 is transformed into the image coordinate system by the projection matrix of the camera itself, and the projection coordinates (u1, v1) of the point on the image are obtained.
- the projection coordinates on the image are estimated values.
- the projection matrix here is the internal parameter matrix of the camera. This step converts the three-dimensional point P2 into the theoretical two-dimensional coordinates (u1, v1) of the point on the camera image through the camera's internal parameter matrix.
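Step S4's projection through the intrinsic matrix K can be sketched as a standard pinhole projection (no lens distortion). The focal lengths and principal point below are illustrative values, not taken from the source:

```python
# Project a camera-frame 3D point P2 onto the image plane via the
# intrinsic matrix K (pinhole model: perspective division, then K).

def project(K, p2):
    """Map a 3D camera-frame point P2 = (x, y, z) to pixel coords (u1, v1)."""
    x, y, z = p2
    fx, fy = K[0][0], K[1][1]
    cx, cy = K[0][2], K[1][2]
    u1 = fx * x / z + cx
    v1 = fy * y / z + cy
    return (u1, v1)

K = [[800.0, 0.0, 640.0],   # fx, skew, cx  (illustrative values)
     [0.0, 800.0, 360.0],   # fy, cy
     [0.0, 0.0, 1.0]]

print(project(K, (1.0, 0.5, 4.0)))  # -> (840.0, 460.0)
```

The pair (u1, v1) is the estimated pixel position that is compared against the observed (u0, v0) in step S5.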
- Step S5 Construct the reprojection error function of (u0, v0) and (u1, v1), and calculate the reprojection error, which is related to M[R|t]; in the function:
- K: the internal parameter matrix of the camera
- R_b, t_b: the attitude and position of the IMU, respectively
- Step S6 Optimize the value of M[R|t] and repeat steps S3-S5 until the reprojection error is below the specified threshold; the value of M[R|t] at that point is used as the calibration result.
- optimizing M[R|t] by the method of least squares is prior art.
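The optimize-until-below-threshold loop of step S6 can be illustrated with a toy numerical search over the six pose parameters (three Euler angles, three translations). The text notes that optimizing M[R|t] by least squares is prior art; the simple coordinate descent below stands in for a proper least-squares solver purely for illustration, and all numeric values are synthetic:

```python
# Toy sketch of step S6: iteratively reduce the summed squared
# reprojection error until it falls below a threshold, starting from
# R = t = 0 as the text specifies. Real systems would use least squares.
import math

def transform_project(params, p, K):
    """Apply M[R|t] (Euler angles ax, ay, az and translation tx, ty, tz)
    to a 3D point p, then project through intrinsics K = (fx, fy, cx, cy)."""
    ax, ay, az, tx, ty, tz = params
    ca, sa = math.cos(ax), math.sin(ax)
    cb, sb = math.cos(ay), math.sin(ay)
    cc, sc = math.cos(az), math.sin(az)
    R = [[cc * cb, cc * sb * sa - sc * ca, cc * sb * ca + sc * sa],
         [sc * cb, sc * sb * sa + cc * ca, sc * sb * ca - cc * sa],
         [-sb, cb * sa, cb * ca]]
    t = (tx, ty, tz)
    q = [sum(R[i][k] * p[k] for k in range(3)) + t[i] for i in range(3)]
    fx, fy, cx, cy = K
    return (fx * q[0] / q[2] + cx, fy * q[1] / q[2] + cy)

def total_error(params, points, pixels, K):
    """Summed squared reprojection error over all marked points."""
    return sum((u0 - u1) ** 2 + (v0 - v1) ** 2
               for p, (u0, v0) in zip(points, pixels)
               for u1, v1 in [transform_project(params, p, K)])

def calibrate(points, pixels, K, threshold=1e-8, sweeps=200):
    params = [0.0] * 6        # R and t both start from 0
    step = 0.1
    for _ in range(sweeps):
        for i in range(6):    # try a +/- step on each parameter in turn
            for delta in (step, -step):
                trial = list(params)
                trial[i] += delta
                if total_error(trial, points, pixels, K) < total_error(params, points, pixels, K):
                    params = trial
        step *= 0.8           # refine the search scale
        if total_error(params, points, pixels, K) < threshold:
            break             # reprojection error below the threshold
    return params

K = (800.0, 800.0, 640.0, 360.0)   # illustrative intrinsics
points = [(0.5, 0.2, 4.0), (-0.3, 0.4, 5.0), (0.1, -0.5, 6.0), (0.8, 0.1, 3.0)]
true_pose = [0.02, -0.01, 0.03, 0.1, -0.2, 0.3]  # synthetic ground truth
pixels = [transform_project(true_pose, p, K) for p in points]
est = calibrate(points, pixels, K)
# total_error(est, ...) is no larger than total_error([0]*6, ...).
```

The stopping criterion mirrors the method's description: iterate S3-S5, adjusting M[R|t], until the reprojection error drops below the specified threshold.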
- multiple high-resolution cameras can be set around the vehicle, as long as 360-degree high-definition photography or video recording is achieved.
- the calibration method of the camera's external parameters includes the following steps:
- Step S1 Set one or more marked points, acquire an image including the marked points with the cameras arranged around the vehicle, and identify the position (u0, v0) of the marked points in the image.
- the position coordinates of the marked points in the image can be calculated by a trained neural network.
- the neural network is trained in advance on a large amount of existing data (that is, images in which the positions of the marked points and their corresponding coordinates are known).
- Step S2 The actual GPS position of the marked point is converted to the IMU coordinate system with the IMU as the origin of the coordinates to obtain the point P1.
- the GPS position and IMU coordinates are both three-dimensional coordinates, and the resulting point P1 is also a three-dimensional coordinate.
- Step S3 P1 is transformed into the camera coordinate system with the camera as the coordinate origin by the rotation-translation transformation matrix M[R|t] to obtain the point P2.
- the point P2 here is also a three-dimensional coordinate.
- Step S4 P2 is transformed into the image coordinate system by the projection matrix of the camera itself, and the projection coordinates (u1, v1) of the point on the image are obtained.
- the projection matrix here is the internal parameter matrix of the camera. This step is actually converting the three-dimensional point coordinate P2 into the theoretical two-dimensional coordinate (u1, v1) of the point on the camera image through the internal parameter matrix of the camera.
- Step S5 Construct the reprojection error function of (u0, v0) and (u1, v1), and calculate the reprojection error, which is related to M[R|t]; in the function:
- K: the internal parameter matrix of the camera
- R_b, t_b: the attitude and position of the IMU, respectively
- Step S6 Optimize the value of M[R|t] and repeat steps S3-S5 until the reprojection error is below the specified threshold; this M[R|t] is the calibration result.
- the present invention illustrates the detailed structural features of the present invention through the foregoing embodiments, but the present invention is not limited to the detailed structural features described above, which does not mean that the present invention must rely on the detailed structural features described above to be implemented.
- Those skilled in the art should understand that any improvement to the present invention, equivalent replacement of the components used in the present invention, addition of auxiliary components, selection of specific methods, etc., all fall within the scope of protection and disclosure of the present invention.
Abstract
Description
[Corrected 16.01.2019 in accordance with Rule 26]
- GPS coordinates of the i-th marked point;
- the position of the i-th marked point in the image; this position is also an observation value and may be identified manually;
- where the symbol (omitted in the published text) denotes the coordinates (u1, v1) of the i-th marked point, an estimated value.
Claims (9)
- A method for calibrating external parameters of a camera, characterized in that the method includes the following steps:
Step S1: set one or more marked points, acquire an image including the marked points through a camera device, and identify the position (u0, v0) of the marked points in the image;
Step S2: convert the actual GPS position of the marked point into the IMU coordinate system with the IMU as the coordinate origin to obtain the point P1;
Step S3: transform P1 into a coordinate system with the camera device as the coordinate origin by the rotation-translation transformation matrix M[R|t] to obtain the point P2;
Step S4: transform P2 into the coordinate system of the image by the internal parameter matrix K of the camera device itself to obtain the projection coordinates (u1, v1) of the point on the image;
Step S5: construct the reprojection error function of (u0, v0) and (u1, v1) and calculate the reprojection error, which is related to the rotation-translation transformation matrix M[R|t], where K is the internal parameter matrix of the camera and R_b, t_b are respectively the attitude and position of the IMU;
Step S6: optimize the value of M[R|t] and repeat steps S3-S5 until the reprojection error is lower than the specified threshold, at which point the value of M[R|t] is taken as the calibration result;
in the rotation-translation transformation matrix M[R|t], R represents rotation and t represents translation.
- The method according to claim 1, wherein in step S1 the camera device is disposed on the roof of the vehicle or around the vehicle.
- The method according to any one of claims 1-2, wherein the camera device comprises a plurality of cameras, and there is an overlapping area between the images acquired by the cameras.
- The method according to any one of claims 1-3, wherein in step S1 the position (u0, v0) of the marked point in the image is obtained by searching row by row and column by column, manually or by computer.
- The method according to any one of claims 1-4, wherein in step S1 the position (u0, v0) of the marked point in the image is obtained through a trained neural network.
- The method according to any one of claims 1-6, wherein in step S6 the value of M[R|t] is optimized by the least-squares method.
- The method according to any one of claims 1-7, wherein the initial values of the rotation R and the translation t are both 0.
- The method according to any one of claims 1-8, wherein in step S1 the number of marked points is 10-20.
- The method according to any one of claims 1-9, wherein in step S2 the actual GPS position of the marked point is measured in advance, or measured directly using an RTK device during calibration.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811154021.3 | 2018-09-30 | ||
CN201811154021.3A CN110969663B (en) | 2018-09-30 | 2018-09-30 | Static calibration method for external parameters of camera |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020062434A1 (en) | 2020-04-02 |
Family
ID=69952814
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/113667 WO2020062434A1 (en) | 2018-09-30 | 2018-11-02 | Static calibration method for external parameters of camera |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110969663B (en) |
WO (1) | WO2020062434A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112419420B (en) * | 2020-09-17 | 2022-01-28 | 腾讯科技(深圳)有限公司 | Camera calibration method and device, electronic equipment and storage medium |
CN112611361A (en) * | 2020-12-08 | 2021-04-06 | 华南理工大学 | Method for measuring installation error of camera of airborne surveying and mapping pod of unmanned aerial vehicle |
CN112802123B (en) * | 2021-01-21 | 2023-10-27 | 北京科技大学设计研究院有限公司 | Binocular linear array camera static calibration method based on stripe virtual target |
CN112837381A (en) * | 2021-02-09 | 2021-05-25 | 上海振华重工(集团)股份有限公司 | Camera calibration method, system and equipment suitable for driving equipment |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102010029110A1 (en) * | 2009-05-29 | 2010-12-02 | Mori Seiki Co., Ltd., Yamatokoriyama-shi | Calibration method and calibration device |
CN103049912A (en) * | 2012-12-21 | 2013-04-17 | 浙江大学 | Random trihedron-based radar-camera system external parameter calibration method |
CN103593836A (en) * | 2012-08-14 | 2014-02-19 | 无锡维森智能传感技术有限公司 | A Camera parameter calculating method and a method for determining vehicle body posture with cameras |
CN103745452A (en) * | 2013-11-26 | 2014-04-23 | 理光软件研究所(北京)有限公司 | Camera external parameter assessment method and device, and camera external parameter calibration method and device |
CN103985118A (en) * | 2014-04-28 | 2014-08-13 | 无锡观智视觉科技有限公司 | Parameter calibration method for cameras of vehicle-mounted all-round view system |
CN105547228A (en) * | 2015-12-09 | 2016-05-04 | 中国航空工业集团公司西安飞机设计研究所 | Angle displacement measurement device |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010045766A (en) * | 2008-07-16 | 2010-02-25 | Kodaira Associates Kk | System for calibrating urban landscape image information |
CN104766058B (en) * | 2015-03-31 | 2018-04-27 | 百度在线网络技术(北京)有限公司 | A kind of method and apparatus for obtaining lane line |
US10338209B2 (en) * | 2015-04-28 | 2019-07-02 | Edh Us Llc | Systems to track a moving sports object |
US9892558B2 (en) * | 2016-02-19 | 2018-02-13 | The Boeing Company | Methods for localization using geotagged photographs and three-dimensional visualization |
CN105930819B (en) * | 2016-05-06 | 2019-04-12 | 西安交通大学 | Real-time city traffic lamp identifying system based on monocular vision and GPS integrated navigation system |
CN107067437B (en) * | 2016-12-28 | 2020-02-21 | 中国航天电子技术研究院 | Unmanned aerial vehicle positioning system and method based on multi-view geometry and bundle adjustment |
CN108279428B (en) * | 2017-01-05 | 2020-10-16 | 武汉四维图新科技有限公司 | Map data evaluating device and system, data acquisition system, acquisition vehicle and acquisition base station |
CN106803273B (en) * | 2017-01-17 | 2019-11-22 | 湖南优象科技有限公司 | A kind of panoramic camera scaling method |
CN108364252A (en) * | 2018-01-12 | 2018-08-03 | 深圳市粒视界科技有限公司 | A kind of correction of more fish eye lens panorama cameras and scaling method |
CN108288294A (en) * | 2018-01-17 | 2018-07-17 | 视缘(上海)智能科技有限公司 | A kind of outer ginseng scaling method of a 3D phases group of planes |
CN108230381B (en) * | 2018-01-17 | 2020-05-19 | 华中科技大学 | Multi-view stereoscopic vision method combining space propagation and pixel level optimization |
- 2018-09-30 CN CN201811154021.3A patent/CN110969663B/en active Active
- 2018-11-02 WO PCT/CN2018/113667 patent/WO2020062434A1/en active Application Filing
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112037285B (en) * | 2020-08-24 | 2024-03-29 | 上海电力大学 | Camera calibration method based on Levy flight and variation mechanism gray wolf optimization |
CN112116529A (en) * | 2020-09-23 | 2020-12-22 | 浙江浩腾电子科技股份有限公司 | PTZ camera-based conversion method for GPS coordinates and pixel coordinates |
CN112489111A (en) * | 2020-11-25 | 2021-03-12 | 深圳地平线机器人科技有限公司 | Camera external parameter calibration method and device and camera external parameter calibration system |
CN112489111B (en) * | 2020-11-25 | 2024-01-30 | 深圳地平线机器人科技有限公司 | Camera external parameter calibration method and device and camera external parameter calibration system |
CN112465920A (en) * | 2020-12-08 | 2021-03-09 | 广州小鹏自动驾驶科技有限公司 | Vision sensor calibration method and device |
CN112767498A (en) * | 2021-02-03 | 2021-05-07 | 苏州挚途科技有限公司 | Camera calibration method and device and electronic equipment |
CN113284193A (en) * | 2021-06-22 | 2021-08-20 | 智道网联科技(北京)有限公司 | Calibration method, device and equipment of RS equipment |
CN113284193B (en) * | 2021-06-22 | 2024-02-02 | 智道网联科技(北京)有限公司 | Calibration method, device and equipment of RS equipment |
Also Published As
Publication number | Publication date |
---|---|
CN110969663B (en) | 2023-10-03 |
CN110969663A (en) | 2020-04-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020062434A1 (en) | Static calibration method for external parameters of camera | |
CN112894832B (en) | Three-dimensional modeling method, three-dimensional modeling device, electronic equipment and storage medium | |
CN104330074B (en) | Intelligent surveying and mapping platform and realizing method thereof | |
US10127721B2 (en) | Method and system for displaying and navigating an optimal multi-dimensional building model | |
CN108168521A (en) | One kind realizes landscape three-dimensional visualization method based on unmanned plane | |
CN107179086A (en) | A kind of drafting method based on laser radar, apparatus and system | |
CN107504957A (en) | The method that three-dimensional terrain model structure is quickly carried out using unmanned plane multi-visual angle filming | |
CN107194989A (en) | The scene of a traffic accident three-dimensional reconstruction system and method taken photo by plane based on unmanned plane aircraft | |
CN103426165A (en) | Precise registration method of ground laser-point clouds and unmanned aerial vehicle image reconstruction point clouds | |
CA2705809A1 (en) | Method and apparatus of taking aerial surveys | |
CN100417231C (en) | Three-dimensional vision semi-matter simulating system and method | |
CN103226838A (en) | Real-time spatial positioning method for mobile monitoring target in geographical scene | |
CN101545776B (en) | Method for obtaining digital photo orientation elements based on digital map | |
CN103310487B (en) | A kind of universal imaging geometric model based on time variable generates method | |
KR20200110120A (en) | A system implementing management solution of road facility based on 3D-VR multi-sensor system and a method thereof | |
CN110706273B (en) | Real-time collapse area measurement method based on unmanned aerial vehicle | |
CN111612901A (en) | Extraction feature and generation method of geographic information image | |
CN115962757A (en) | Unmanned aerial vehicle surveying and mapping method, system and readable storage medium | |
Koeva | 3D modelling and interactive web-based visualization of cultural heritage objects | |
CN113808269A (en) | Map generation method, positioning method, system and computer readable storage medium | |
CN104133874B (en) | Streetscape image generating method based on true color point cloud | |
CN111964665B (en) | Intelligent vehicle positioning method and system based on vehicle-mounted all-around image and storage medium | |
CN113034347A (en) | Oblique photographic image processing method, device, processing equipment and storage medium | |
CN112348941A (en) | Real-time fusion method and device based on point cloud and image data | |
CN116823966A (en) | Internal reference calibration method and device for camera, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18935019; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 18935019; Country of ref document: EP; Kind code of ref document: A1 |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 16.11.2021) |