CN110969663B - Static calibration method for external parameters of camera - Google Patents

Static calibration method for external parameters of camera

Info

Publication number
CN110969663B
CN110969663B CN201811154021.3A
Authority
CN
China
Prior art keywords
image
camera
point
imu
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811154021.3A
Other languages
Chinese (zh)
Other versions
CN110969663A (en)
Inventor
陈亮
李晓东
芦超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Momenta Technology Co Ltd
Original Assignee
Beijing Momenta Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Momenta Technology Co Ltd filed Critical Beijing Momenta Technology Co Ltd
Priority to CN201811154021.3A priority Critical patent/CN110969663B/en
Priority to PCT/CN2018/113667 priority patent/WO2020062434A1/en
Publication of CN110969663A publication Critical patent/CN110969663A/en
Application granted granted Critical
Publication of CN110969663B publication Critical patent/CN110969663B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Navigation (AREA)

Abstract

A static calibration method for external parameters of a camera, comprising the following steps. Step S1: set a plurality of marker points. Step S2: convert the actual GPS position of each marker point into an IMU coordinate system with the IMU as the coordinate origin, obtaining a point P1. Step S3: convert P1 into a camera coordinate system with the camera as the coordinate origin by means of a rotation-translation transformation matrix, obtaining P2. Step S4: convert P2 into the image coordinate system through the intrinsic matrix of the camera device, obtaining the projection coordinates of the point on the image. Step S5: calculate the re-projection error with the constructed re-projection error function. Step S6: optimize the rotation-translation transformation matrix and repeat steps S3-S5 until the re-projection error falls below a specified threshold. The method constructs a loss function from the GPS coordinates of the marker points and their pixel coordinates in the camera device, and finally obtains the Euclidean transformation between the two, i.e., the rotation-translation transformation matrix. The method requires no modification of the acquisition equipment or the vehicle, and the calibration of the camera external parameters can be completed in a static state.

Description

Static calibration method for external parameters of camera
Technical Field
The application relates to the field of intelligent driving, in particular to a static calibration method for external parameters of a camera.
Background
Currently, the accuracy of an autonomous-driving map is measured in the GPS coordinate system; that is, every point on the map must be represented by GPS coordinates. In a multi-sensor setup, the camera and the GPS may not be mounted at the same location on the vehicle; they may be separated by a distance of 2-3 meters. The external parameters of the camera therefore need to be calibrated, establishing the spatial relation between the camera and the GPS module. If the camera external parameters are not calibrated and the map is built directly from the camera image and the vehicle-body position, errors of two to three meters can result.
In the traditional calibration method, the camera and the IMU (the IMU carries a GPS module) are bound tightly together and then shaken vigorously, so that every axis of the IMU and of the camera is excited and the trajectories of the two can be aligned, completing the calibration of the camera. However, a camera and an IMU mounted on a vehicle cannot be shaken in this way, so the traditional calibration method is not applicable.
In addition, the prior art contains no related research on how to build a relation between the re-projection error and the transformation matrix so as to adjust the transformation matrix during the coordinate transformation.
Disclosure of Invention
In view of the problems existing in the prior art, the application adopts the following technical scheme:
a method for calibrating external parameters of a camera, the method comprising the steps of:
step S1: setting one or more mark points, acquiring an image comprising the mark points through an imaging device, and identifying the positions (u 0, v 0) of the mark points in the image;
step S2: converting the actual GPS position of the marked point into an IMU coordinate system with the IMU as a coordinate origin to obtain a point P1;
step S3: converting P1 into a coordinate system with the image pickup device as a coordinate origin by using a rotation translation transformation matrix M [ R|t ] to obtain a point P2;
step S4: converting P2 into a coordinate system of the image through an internal reference matrix K of the image pickup device to obtain projection coordinates (u 1, v 1) of the point on the image;
step S5: constructing a reprojection error function of (u 0, v 0) and (u 1, v 1), and calculating a reprojection error, wherein the reprojection error is related to the rotary translational transformation matrix M [ R|t ];
step S6: optimizing the value of M [ R|t ], repeating the steps S3-S5 until the re-projection error is lower than the specified threshold, wherein the value of M [ R|t ] is used as the calibration result;
r in the rotation translation transformation matrix M [ R|t ] represents rotation, and t represents translation.
Preferably, in step S1, the camera device is disposed on the roof or around the vehicle.
Preferably, the camera device comprises a plurality of cameras, and there is an overlapping area between the images acquired by the cameras.
Preferably, in step S1, the positions (u0, v0) of the marker points in the image are found by a human or a computer, searching row by row and column by column.
Preferably, in step S1, the positions (u0, v0) of the marker points in the image are obtained by means of a trained neural network.
Preferably, in step S5, the re-projection error function is as follows:

E(R, t) = Σ_i ‖ π( K · (R · P1_i + t) ) − p_i ‖², with P1_i = Φ(R_b, t_b; G_i)

where:
K is the intrinsic matrix of the camera;
R_b and t_b are the attitude and the position of the IMU, respectively, and Φ(R_b, t_b; G_i) is the step-S2 conversion of the GPS coordinate into the IMU frame;
G_i is the GPS coordinate of the i-th marker point;
p_i is the position of the i-th marker point in the image;
π(·) denotes the perspective division that maps a projected point to pixel coordinates.
Preferably, in step S6, the value of M [ R|t ] is optimized by the least squares method.
Preferably, the initial values of the rotation R and the translation t are both 0.
Preferably, in step S1, the number of the marking points is 10-20.
Preferably, in step S2, the actual GPS position of the marker point may be measured in advance, or may be measured directly by using the RTK apparatus at the time of calibration.
The application is notable in the following aspects, but is not limited to them:
(1) A static calibration method for the external parameters of a camera is provided; the calibration can be completed in a static state without modifying the acquisition equipment or the vehicle. The traditional calibration method requires binding the camera and the IMU (which carries a GPS module) tightly together and shaking them vigorously; when the vehicle is stationary, the traditional method cannot be used. This is one of the key points of the application.
(2) A brand-new loss function is designed, through which the rotation-translation transformation matrix can be obtained accurately and effectively. This is mainly manifested in the following aspects: the loss function takes into account the attitude and position of the IMU as well as the GPS coordinates of the marker points; the loss function is linked to the rotation-translation transformation matrix in the mathematical model, and the value of M[R|t] is adjusted until the re-projection error falls below a specified threshold, yielding the calibration result. This is one of the key points of the application.
Drawings
FIG. 1 is a step diagram of a camera external parameter static calibration method according to the present application.
The present application will be described in further detail below. The following examples are merely illustrative of the present application and are not intended to represent or limit the scope of the application as defined in the claims.
Detailed Description
The technical scheme of the application is further described below by the specific embodiments with reference to the accompanying drawings.
For a better illustration of the present application and to facilitate understanding of its technical solution, exemplary but non-limiting examples are given below:
the three-dimensional map has the advantages of good expression effect, small data volume, high transmission speed, low cost and the like, and is widely applied to the fields of social management, public service, propaganda display and the like after years of development.
Generally, three-dimensional modeling staff complete the modeling work by stitching images, using a high-resolution satellite map or aerial photograph as the reference map for the data gathered by collection staff. However, aerial or unmanned-aerial-vehicle photography has numerous disadvantages, such as excessive cost and low resolution. The application provides a brand-new data acquisition mode for three-dimensional maps: a 360-degree omnidirectional high-resolution camera device is arranged on a vehicle, for example four high-resolution cameras at the front, rear, left, and right of the roof; the vehicle traverses every street and crossing, the camera device on the vehicle shoots high-resolution images as the reference map, and the images are then stitched to complete the three-dimensional modeling work.
The external parameters of a camera are its parameters in the world coordinate system, such as the camera position and rotation direction. Calibrating the external parameters of the camera means solving the mapping from world coordinates to pixel coordinates: the world coordinates are considered given, the world coordinates and pixel coordinates of the calibration control points are known, and the mapping relation between them is solved. Once this relation is known, the world coordinates of a point can be deduced back from its pixel coordinates.
The purpose of camera calibration is to determine the values of some parameters of the camera by which a worker can map points in a three-dimensional space to image space, or vice versa.
The application relates to a static calibration method: the vehicle is stationary during calibration and the positions of the marker points do not change. A marker point may be a special sign board or a marker rod. The GPS position of a marker point is known; it can be measured in advance, obtained directly during calibration, or measured with RTK equipment during calibration.
Typically a camera has 6 external parameters. The rotation parameters about the three axes are (ω, δ, θ); the 3×3 rotation matrices of the individual axes are composed (i.e., multiplied together in sequence) to obtain the overall rotation matrix R, of size 3×3. The translation t consists of the translation parameters (Tx, Ty, Tz) along the three axes.
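The composition of the three per-axis rotations into R can be sketched in Python with NumPy; the Rz·Ry·Rx multiplication order below is an assumption for illustration, since the text does not fix a convention:

```python
import numpy as np

def rotation_matrix(omega, delta, theta):
    """Compose per-axis rotations (in radians) about X, Y, Z into one
    3x3 rotation matrix R. The Rz @ Ry @ Rx order is an assumed
    convention, not one prescribed by the text."""
    cx, sx = np.cos(omega), np.sin(omega)
    cy, sy = np.cos(delta), np.sin(delta)
    cz, sz = np.cos(theta), np.sin(theta)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

R = rotation_matrix(0.01, -0.02, 0.03)
# Any valid rotation matrix is orthonormal with determinant 1
assert np.allclose(R @ R.T, np.eye(3))
assert np.isclose(np.linalg.det(R), 1.0)
```

Together with the three translation parameters, this gives the 6 external parameters to be calibrated.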
Example 1
In this embodiment, 360-degree high-definition shooting or photographing is achieved by a plurality of high-resolution cameras on top of the vehicle. With the vehicle stationary, the calibration method for the camera external parameters comprises the following steps:
step S1: one or more marker points are set, an image including the marker points is acquired by a camera at the roof of the vehicle, and the positions (u 0, v 0) of the marker points in the image are identified. The position of the marking point in the image is an observed value, the marking point can be a special marking plate or a marking rod, and the acquired signal is high in accuracy due to the fact that the marking point occupies a small area and the GPS position is known; the GPS position of the marker point is known, and can be measured in advance, directly obtained during calibration, or measured by using RTK equipment during calibration.
Because the camera has 6 external parameters, at least 6 marker points are needed to solve for the parameters from the corresponding mathematical equations. In practice, to reduce the influence of data noise on the solution, about 10 to 20 marker points are usually selected; in theory more marker points further benefit the solution, but they also increase its cost. The coordinates of the marker points in the image shot by the camera can be found manually, searching row by row and column by column, or by a computer.
Step S2: and converting the actual GPS position of the marked point into an IMU coordinate system with the IMU as the origin of coordinates to obtain a point P1. The longitude and latitude system of GPS is an ellipsoid system, not orthogonal, but the rotation translation transformation matrix M [ R|t ] is based on an orthogonal coordinate system. Thus, converting GPS points to an IMU coordinate system lower representation may convert the longitude and latitude high coordinates to Cartesian coordinates (i.e., an orthogonal coordinate system). The conversion relation is a standard conversion method in the GIS system. After conversion, the coordinates of the point are three-dimensional coordinates (X, Y, Z) representing the position of the marker point relative to the IMU. The rotation-translation transformation matrix M [ R|t ], R represents rotation, and t represents translation.
Step S3: and converting P1 into a camera coordinate system with a camera as a coordinate origin by rotating and translating a transformation matrix M [ R|t ] to obtain a point P2.
Step S4: p2 is converted into an image coordinate system through a projection matrix of the camera, and projection coordinates (u 1, v 1) of the point on the image are obtained. The projection coordinates on the image are estimated values, the projection matrix here being the reference matrix of the camera, this step is in fact the conversion of the three-dimensional point coordinates P2 into two-dimensional coordinates (u 1, v 1) of the point on the camera image in theory by means of the reference matrix of the camera.
Step S5: constructing a re-projection error function of (u 0, v 0) and (u 1, v 1), calculating a re-projection error, the error being related to M [ R|t ]; the reprojection error function is as follows:
k is an internal reference matrix of the camera;
R b ,t b : the gesture and the position of the IMU are respectively;
the GPS coordinates of the ith mark point;
the position of the i-th marker point in the image. The location is also an observation and may be manually identified.
Wherein the method comprises the steps ofIs the ithThe coordinates (u 1, v 1) of the marker point are estimated values.
Step S6: and optimizing the value of M [ R|t ] by a least square method, repeating the steps S3-S5, and continuously iterating the value of M [ R|t ] until the reprojection error is lower than a specified threshold value, wherein the value of M [ R|t ] is used as a calibration result. The value of M [ R|t ] is optimized by a least square method, which is the prior art, and the value of M [ R|t ] is solved in a plurality of modes, for example, the value of M [ R|t ] is judged by the derivative through derivation of a reprojection error function; the initial value of M R|t may be 0 for simplicity, with a true size around the minus six power of 10.
Example 2
In this embodiment, a plurality of high-resolution cameras may be disposed around the vehicle, so long as 360-degree high-definition shooting or photographing is achieved. The calibration method for the camera external parameters comprises the following steps:
step S1: one or more marker points are set, an image including the marker points is acquired by a camera or a camera around the vehicle, and the positions (u 0, v 0) of the marker points in the image are identified. The position coordinates of the marker points in the image can be calculated by a trained neural network, which is trained by a large amount of existing data (namely, the coordinates of the known marker points in the image).
Step S2: and converting the actual GPS position of the marked point into an IMU coordinate system with the IMU as the origin of coordinates to obtain a point P1. Here, the GPS position and IMU coordinates are both three-dimensional coordinates, and the resulting point P1 is also a three-dimensional coordinate.
Step S3: and converting P1 into a camera coordinate system with a camera as a coordinate origin by rotating and translating a transformation matrix M [ R|t ] to obtain a point P2. The point P2 here is also a three-dimensional coordinate.
Step S4: p2 is converted into an image coordinate system through a projection matrix of the camera, and projection coordinates (u 1, v 1) of the point on the image are obtained. The projection matrix here is the internal reference matrix of the camera, which is in effect the conversion of the three-dimensional point coordinates P2 into two-dimensional coordinates (u 1, v 1) of the point on the camera image in theory by means of the internal reference matrix of the camera.
Step S5: constructing a re-projection error function of (u 0, v 0) and (u 1, v 1), calculating a re-projection error, the error being related to M [ R|t ]; the reprojection error function is as follows:
wherein K: an internal reference matrix of the camera;
R b ,t b : the gesture and the position of the IMU are respectively;
the GPS coordinates of the ith mark point;
the position of the i-th marker point in the image. The location is also an observation and may be manually identified.
Wherein the method comprises the steps ofThe coordinates (u 1, v 1) of the i-th marker point are estimated values.
Step S6: and optimizing the value of M [ R|t ] by a least square method, and repeating the steps S3-S5 until the re-projection error is lower than a specified threshold value, wherein the value of M [ R|t ] is used as a calibration result.
The applicant states that the detailed structural features of the present application are described by the above embodiments, but the application is not limited to them; that is, the application does not have to rely on the above detailed structural features to be implemented. It should be apparent to those skilled in the art that any modifications of the present application, equivalent substitutions of selected components, addition of auxiliary components, selection of specific modes, and the like fall within the scope of protection and disclosure of the application.
The preferred embodiments of the present application have been described in detail above, but the present application is not limited to the specific details of the above embodiments, and various simple modifications can be made to the technical solution of the present application within the scope of the technical concept of the present application, and all the simple modifications belong to the protection scope of the present application.
In addition, the specific features described in the above embodiments may be combined in any suitable manner, and in order to avoid unnecessary repetition, various possible combinations are not described further.
Moreover, any combination of the various embodiments of the application can be made without departing from the spirit of the application, which should also be considered as disclosed herein.

Claims (9)

1. A method for calibrating the external parameters of a camera, the method comprising the following steps:
step S1: setting one or more marker points, acquiring an image that includes the marker points with a camera device, and identifying the positions (u0, v0) of the marker points in the image;
step S2: converting the actual GPS position of each marker point into an IMU coordinate system with the IMU as the coordinate origin, obtaining a point P1;
step S3: converting P1 into a coordinate system with the camera device as the coordinate origin by means of a rotation-translation transformation matrix M[R|t], obtaining a point P2;
step S4: converting P2 into the image coordinate system through the intrinsic matrix K of the camera device, obtaining the projection coordinates (u1, v1) of the point on the image;
step S5: constructing a re-projection error function of (u0, v0) and (u1, v1) and calculating the re-projection error, which is related to the rotation-translation transformation matrix M[R|t];
step S6: optimizing the value of M[R|t] and repeating steps S3-S5 until the re-projection error falls below a specified threshold, the final value of M[R|t] serving as the calibration result;
in the rotation-translation transformation matrix M[R|t], R represents rotation and t represents translation;
in step S5, the re-projection error function is as follows:

E(R, t) = Σ_i ‖ π( K · (R · P1_i + t) ) − p_i ‖², with P1_i = Φ(R_b, t_b; G_i)

where:
K is the intrinsic matrix of the camera;
R_b and t_b are the attitude and the position of the IMU, respectively, and Φ(R_b, t_b; G_i) is the step-S2 conversion of the GPS coordinate into the IMU frame;
G_i is the GPS coordinate of the i-th marker point;
p_i is the position of the i-th marker point in the image.
2. The method according to claim 1, characterized in that: in step S1, the camera device is disposed on the roof or around the vehicle.
3. The method according to claim 1, characterized in that: the camera device comprises a plurality of cameras, and there is an overlapping area between the images acquired by the cameras.
4. The method according to claim 1, characterized in that: in step S1, the positions (u0, v0) of the marker points in the image are obtained by a human or a computer, searching row by row and column by column.
5. The method according to claim 1, characterized in that: in step S1, the positions (u0, v0) of the marker points in the image are obtained through a trained neural network.
6. The method according to claim 1, characterized in that: in step S6, the value of M [ R|t ] is optimized by the least squares method.
7. The method according to claim 1, characterized in that: the initial values of the rotation R and the translation t are all 0.
8. The method according to claim 1, characterized in that: in step S1, the number of the marked points is 10-20.
9. The method according to claim 1, characterized in that: in step S2, the actual GPS position of the marker point may be measured in advance, or may be measured directly by using the RTK apparatus during calibration.
CN201811154021.3A 2018-09-30 2018-09-30 Static calibration method for external parameters of camera Active CN110969663B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811154021.3A CN110969663B (en) 2018-09-30 2018-09-30 Static calibration method for external parameters of camera
PCT/CN2018/113667 WO2020062434A1 (en) 2018-09-30 2018-11-02 Static calibration method for external parameters of camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811154021.3A CN110969663B (en) 2018-09-30 2018-09-30 Static calibration method for external parameters of camera

Publications (2)

Publication Number Publication Date
CN110969663A CN110969663A (en) 2020-04-07
CN110969663B true CN110969663B (en) 2023-10-03

Family

ID=69952814

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811154021.3A Active CN110969663B (en) 2018-09-30 2018-09-30 Static calibration method for external parameters of camera

Country Status (2)

Country Link
CN (1) CN110969663B (en)
WO (1) WO2020062434A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112037285B (en) * 2020-08-24 2024-03-29 上海电力大学 Camera calibration method based on Levy flight and variation mechanism gray wolf optimization
CN112419420B (en) * 2020-09-17 2022-01-28 腾讯科技(深圳)有限公司 Camera calibration method and device, electronic equipment and storage medium
CN112116529A (en) * 2020-09-23 2020-12-22 浙江浩腾电子科技股份有限公司 PTZ camera-based conversion method for GPS coordinates and pixel coordinates
CN112489111B (en) * 2020-11-25 2024-01-30 深圳地平线机器人科技有限公司 Camera external parameter calibration method and device and camera external parameter calibration system
CN112465920A (en) * 2020-12-08 2021-03-09 广州小鹏自动驾驶科技有限公司 Vision sensor calibration method and device
CN112611361A (en) * 2020-12-08 2021-04-06 华南理工大学 Method for measuring installation error of camera of airborne surveying and mapping pod of unmanned aerial vehicle
CN112802123B (en) * 2021-01-21 2023-10-27 北京科技大学设计研究院有限公司 Binocular linear array camera static calibration method based on stripe virtual target
CN112767498A (en) * 2021-02-03 2021-05-07 苏州挚途科技有限公司 Camera calibration method and device and electronic equipment
CN113284193B (en) * 2021-06-22 2024-02-02 智道网联科技(北京)有限公司 Calibration method, device and equipment of RS equipment
CN113850873B (en) * 2021-09-24 2024-06-07 成都圭目机器人有限公司 Offset position calibration method of linear array camera under carrying platform positioning coordinate system
CN114463437A (en) * 2022-01-13 2022-05-10 湖南视比特机器人有限公司 Camera calibration method, device, equipment and computer readable medium
CN114612447A (en) * 2022-03-17 2022-06-10 广东美卡智能信息技术有限公司 Image processing method and device based on data calibration and image processing equipment

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010045766A (en) * 2008-07-16 2010-02-25 Kodaira Associates Kk System for calibrating urban landscape image information
CN103985118A (en) * 2014-04-28 2014-08-13 无锡观智视觉科技有限公司 Parameter calibration method for cameras of vehicle-mounted all-round view system
CN104766058A (en) * 2015-03-31 2015-07-08 百度在线网络技术(北京)有限公司 Method and device for obtaining lane line
CN105930819A (en) * 2016-05-06 2016-09-07 西安交通大学 System for real-time identifying urban traffic lights based on single eye vision and GPS integrated navigation system
CN106803273A (en) * 2017-01-17 2017-06-06 湖南优象科技有限公司 A kind of panoramic camera scaling method
CN107067437A (en) * 2016-12-28 2017-08-18 中国航天电子技术研究院 A kind of unmanned plane alignment system and method based on multiple view geometry and bundle adjustment
CN108230381A (en) * 2018-01-17 2018-06-29 华中科技大学 A kind of combination spatial and the multiple view Stereo Vision of Pixel-level optimization
CN108279428A (en) * 2017-01-05 2018-07-13 武汉四维图新科技有限公司 Map datum evaluating apparatus and system, data collecting system and collecting vehicle and acquisition base station
CN108288294A (en) * 2018-01-17 2018-07-17 视缘(上海)智能科技有限公司 A kind of outer ginseng scaling method of a 3D phases group of planes
CN108364252A (en) * 2018-01-12 2018-08-03 深圳市粒视界科技有限公司 A kind of correction of more fish eye lens panorama cameras and scaling method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8310539B2 (en) * 2009-05-29 2012-11-13 Mori Seiki Co., Ltd Calibration method and calibration device
CN103593836A (en) * 2012-08-14 2014-02-19 无锡维森智能传感技术有限公司 A Camera parameter calculating method and a method for determining vehicle body posture with cameras
CN103049912B (en) * 2012-12-21 2015-03-11 浙江大学 Random trihedron-based radar-camera system external parameter calibration method
CN103745452B (en) * 2013-11-26 2014-11-26 理光软件研究所(北京)有限公司 Camera external parameter assessment method and device, and camera external parameter calibration method and device
US11959749B2 (en) * 2014-06-20 2024-04-16 Profound Positioning Inc. Mobile mapping system
US10338209B2 (en) * 2015-04-28 2019-07-02 Edh Us Llc Systems to track a moving sports object
CN105547228B (en) * 2015-12-09 2018-10-09 中国航空工业集团公司西安飞机设计研究所 A kind of angle displacement measuring device
US9892558B2 (en) * 2016-02-19 2018-02-13 The Boeing Company Methods for localization using geotagged photographs and three-dimensional visualization

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010045766A (en) * 2008-07-16 2010-02-25 Kodaira Associates Kk System for calibrating urban landscape image information
CN103985118A (en) * 2014-04-28 2014-08-13 无锡观智视觉科技有限公司 Parameter calibration method for cameras of vehicle-mounted all-round view system
CN104766058A (en) * 2015-03-31 2015-07-08 百度在线网络技术(北京)有限公司 Method and device for obtaining lane line
CN105930819A (en) * 2016-05-06 2016-09-07 西安交通大学 System for real-time identifying urban traffic lights based on single eye vision and GPS integrated navigation system
CN107067437A (en) * 2016-12-28 2017-08-18 中国航天电子技术研究院 A kind of unmanned plane alignment system and method based on multiple view geometry and bundle adjustment
CN108279428A (en) * 2017-01-05 2018-07-13 武汉四维图新科技有限公司 Map datum evaluating apparatus and system, data collecting system and collecting vehicle and acquisition base station
CN106803273A (en) * 2017-01-17 2017-06-06 湖南优象科技有限公司 A kind of panoramic camera scaling method
CN108364252A (en) * 2018-01-12 2018-08-03 深圳市粒视界科技有限公司 A kind of correction of more fish eye lens panorama cameras and scaling method
CN108230381A (en) * 2018-01-17 2018-06-29 华中科技大学 A kind of combination spatial and the multiple view Stereo Vision of Pixel-level optimization
CN108288294A (en) * 2018-01-17 2018-07-17 视缘(上海)智能科技有限公司 A kind of outer ginseng scaling method of a 3D phases group of planes

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
He Guohui; Huang Wenjun. Calibration of external parameters of multiple cameras based on a circular dot array. Journal of Wuyi University (Natural Science Edition), 2015, (No. 4), full text. *
Calibration method for a laser-camera system based on two-viewpoint feature matching; Zhang Qin; Jia Qingxuan; Xi Ning; Jiang Yong; Chinese Journal of Scientific Instrument, (No. 11), pp. 2623-2627 *
Zhang Qin; Jia Qingxuan; Xi Ning; Jiang Yong. Calibration method for a laser-camera system based on two-viewpoint feature matching. Chinese Journal of Scientific Instrument, 2012, (No. 11), full text. *
Chen Mingwei; Zhu Dengming; Mao Tianlu; Wang Zhaoqi. Research on trajectory tracking for motion-control cameras based on second-order cone programming. High Technology Letters, 2013, (No. 8), full text. *

Also Published As

Publication number Publication date
CN110969663A (en) 2020-04-07
WO2020062434A1 (en) 2020-04-02

Similar Documents

Publication Publication Date Title
CN110969663B (en) Static calibration method for external parameters of camera
CN110264520B (en) Vehicle-mounted sensor and vehicle pose relation calibration method, device, equipment and medium
CN109461211B (en) Semantic vector map construction method and device based on visual point cloud and electronic equipment
CN109690622A (en) Camera registration in multicamera system
KR102295809B1 (en) Apparatus for acquisition distance for all directions of vehicle
CN112819903A (en) Camera and laser radar combined calibration method based on L-shaped calibration plate
CN111539484B (en) Method and device for training neural network
CN111815713A (en) Method and system for automatically calibrating external parameters of camera
CN110260857A (en) Calibration method, device and the storage medium of vision map
CN109118537B (en) Picture matching method, device, equipment and storage medium
CN114283201A (en) Camera calibration method and device and road side equipment
KR20200110120A (en) A system implementing management solution of road facility based on 3D-VR multi-sensor system and a method thereof
CN108603933A (en) The system and method exported for merging the sensor with different resolution
CN114782548B (en) Global image-based radar data calibration method, device, equipment and medium
CN115588040A (en) System and method for counting and positioning coordinates based on full-view imaging points
CN115439531A (en) Method and equipment for acquiring target space position information of target object
CN111145262B (en) Vehicle-mounted-based monocular calibration method
CN115439528A (en) Method and equipment for acquiring image position information of target object
CN114659523A (en) Large-range high-precision attitude measurement method and device
CN114777768A (en) High-precision positioning method and system for satellite rejection environment and electronic equipment
CN110969664B (en) Dynamic calibration method for external parameters of camera
CN117274338A (en) Unmanned aerial vehicle hyperspectral image alignment method, device, terminal and storage medium
CN113624223B (en) Indoor parking lot map construction method and device
CN109974660A (en) Method based on unmanned plane hovering video measuring unmanned plane hovering precision
WO2023283929A1 (en) Method and apparatus for calibrating external parameters of binocular camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220301

Address after: 100083 unit 501, block AB, Dongsheng building, No. 8, Zhongguancun East Road, Haidian District, Beijing

Applicant after: BEIJING MOMENTA TECHNOLOGY Co.,Ltd.

Address before: Room 28, 4 / F, block a, Dongsheng building, No. 8, Zhongguancun East Road, Haidian District, Beijing 100089

Applicant before: BEIJING CHUSUDU TECHNOLOGY Co.,Ltd.

GR01 Patent grant