CN110969664B - Dynamic calibration method for external parameters of camera - Google Patents


Info

Publication number
CN110969664B
CN110969664B (application CN201811154031.7A)
Authority
CN
China
Prior art keywords
image
camera
point
vehicle
imu
Prior art date
Legal status
Active
Application number
CN201811154031.7A
Other languages: Chinese (zh)
Other versions: CN110969664A (en)
Inventor
李晓东
陈亮
芦超
Current Assignee
Beijing Momenta Technology Co ltd
Original Assignee
Beijing Momenta Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Momenta Technology Co ltd
Priority to CN201811154031.7A
Publication of CN110969664A
Application granted
Publication of CN110969664B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

A dynamic calibration method for the external parameters of a camera, comprising the following steps. Step S1: setting one or more marker points and acquiring their corresponding GPS positions while the vehicle drives slowly. Step S2: projecting the one or more marker points into an image acquired by the vehicle's camera device through a transformation matrix M[T|t] to obtain theoretical projection coordinates of the marker points on the image. Step S3: constructing a loss function associated with T and t and calculating the error. By constructing a loss function, related to the rotation-translation transformation matrix T and the time delay t, between the observed coordinates of the marker points in the camera device and the coordinate estimates obtained through a series of transformations, the calibration of T and t can be completed in one pass.

Description

Dynamic calibration method for external parameters of camera
Technical Field
The application relates to the field of intelligent driving, and in particular to a dynamic calibration method for the external parameters of a camera.
Background
Currently, the accuracy of an autonomous-driving map is measured in the GPS coordinate system; that is, every point on the map must be represented by GPS coordinates. In a multi-sensor setup, the camera and the GPS may not be mounted at the same location on the vehicle and can be separated by 2-3 meters. The external parameters of the camera therefore need to be calibrated to establish the spatial relationship between the camera and the GPS module. If mapping proceeds directly from the camera images and the vehicle position without calibrating the camera's external parameters, errors of two to three meters can ultimately result.
In the traditional calibration method, the camera and the IMU (the IMU carries a GPS module) are bound tightly together and then shaken vigorously to excite the axes of the IMU and of the camera, so that the trajectory of the IMU and the trajectory of the camera coincide and the calibration of the camera is completed. However, a camera and an IMU mounted on a vehicle cannot be shaken, so the traditional calibration method is not applicable.
In addition, in the static calibration method for external parameters, at least 3 marker points need to be erected around the vehicle for each measurement, and data from multiple measurements must be collected to improve calibration accuracy. The vehicle and the marker points therefore have to be moved repeatedly to finish a single calibration, so completing one external-parameter calibration with the static method takes a relatively long time.
Disclosure of Invention
In view of the problems in the prior art, the application provides a dynamic calibration method for the external parameters of a camera, characterized in that the method comprises the following steps:
step S1: setting one or more marker points while the vehicle is running, and acquiring their corresponding GPS positions;
step S2: projecting the one or more marker points into an image acquired by the vehicle's camera device through a transformation matrix M[T|t] to obtain theoretical projection coordinates of the marker points on the image;
step S3: constructing a loss function associated with T and t, and calculating an error, which is associated with M[T|t], as follows:
K: the internal reference (intrinsic) matrix of the camera;
T_b: the attitude and position of the IMU;
V_b: the three-axis velocity of the IMU;
the GPS coordinates of the i-th marker point;
the position of the i-th marker point in the image;
T: the rotation-translation transformation matrix, comprising rotation and translation;
t: the time delay;
step S4: optimizing the value of M[T|t] by the least-squares method, and repeating steps S1-S3 until the value of the loss function is lower than a specified threshold, the value of M[T|t] then being taken as the calibration result.
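As a concrete, simplified illustration of steps S1-S4, the sketch below synthesizes marker observations with a known ground truth and recovers the translation part of M[T|t] together with the time delay t by least squares. Everything here is an assumption for illustration: the intrinsics, the marker geometry, the identity camera-from-IMU rotation, and the plain Gauss-Newton solver stand in for the patent's full formulation.

```python
import numpy as np

# Made-up intrinsic matrix K; the camera-from-IMU rotation is fixed to
# identity so only the 3-vector translation of M[T|t] and the delay t remain.
K = np.array([[400., 0., 320.],
              [0., 400., 240.],
              [0., 0.,   1.]])

def project(P_gps, p_imu, v_imu, trans, delay):
    """Steps S21-S23: GPS point -> IMU frame -> camera frame -> pixel,
    compensating the IMU position by v_imu * delay for the time delay t."""
    P1 = P_gps - (p_imu + v_imu * delay)   # world -> IMU frame (R_wb = I)
    P2 = P1 + trans                        # IMU -> camera frame (R = I)
    uvw = K @ P2                           # homogeneous pixel coordinates
    return uvw[:2] / uvw[2]

def residuals(theta, data):
    """Step S3: stacked reprojection errors (estimate minus observation)."""
    trans, delay = theta[:3], theta[3]
    return np.concatenate([project(P, p, v, trans, delay) - uv
                           for P, p, v, uv in data])

def gauss_newton(data, iters=25, eps=1e-6):
    """Step S4: least-squares refinement starting from T = 0, t = 0."""
    theta = np.zeros(4)
    for _ in range(iters):
        r = residuals(theta, data)
        J = np.column_stack(
            [(residuals(theta + eps * np.eye(4)[j], data) - r) / eps
             for j in range(4)])           # numeric Jacobian, column by column
        theta = theta - np.linalg.lstsq(J, r, rcond=None)[0]
    return theta

# Synthesize observations with a known ground truth, then recover it.
rng = np.random.default_rng(0)
true_trans, true_delay = np.array([0.10, -0.20, 0.30]), 0.05
data = []
for _ in range(12):
    p_imu = rng.uniform(-1, 1, 3)          # IMU position (slow drive)
    v_imu = rng.uniform(-3, 3, 3)          # velocity must vary between samples
    P_gps = p_imu + np.array([0., 0., 8.]) + rng.uniform(-2, 2, 3)
    uv = project(P_gps, p_imu, v_imu, true_trans, true_delay)
    data.append((P_gps, p_imu, v_imu, uv))

est = gauss_newton(data)   # est[:3] ~ true_trans, est[3] ~ true_delay
```

Note that the velocity must vary between samples: with a constant velocity, the effect of the delay t is indistinguishable from a translation along the direction of travel.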
Preferably, the step S2 specifically includes the following steps:
step S21: converting the actual GPS positions of the one or more marker points into an IMU coordinate system with the IMU as the origin of coordinates to obtain a point P1;
step S22: converting P1 into a coordinate system with the image pickup device as the coordinate origin through the transformation matrix M[T|t] to obtain a point P2;
step S23: converting P2 into the image coordinate system of the vehicle camera through the projection matrix of the camera, obtaining the theoretical projection coordinates of the point on the image.
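The chain S21 → S22 → S23 can be sketched in a few lines. All numeric values below (intrinsics, poses, the marker position) are made-up placeholders, not values from the patent.

```python
import numpy as np

# Illustrative stand-ins: R_wb and p_imu come from the IMU/GPS, R_bc and
# t_bc are the M[T|t] being calibrated, K is the camera intrinsic matrix.
K = np.array([[400., 0., 320.],
              [0., 400., 240.],
              [0., 0.,   1.]])
R_wb = np.eye(3)                    # world -> IMU rotation (IMU attitude)
p_imu = np.array([10., 5., 0.])     # IMU position in the GPS/world frame
R_bc = np.eye(3)                    # IMU -> camera rotation (part of T)
t_bc = np.array([0.5, 0.0, 1.2])    # IMU -> camera translation (part of T)

P_gps = np.array([12., 6., 8.])     # marker point in the world frame

# Step S21: world frame -> IMU coordinate system
P1 = R_wb @ (P_gps - p_imu)
# Step S22: IMU coordinate system -> camera coordinate system via M[T|t]
P2 = R_bc @ P1 + t_bc
# Step S23: camera frame -> theoretical pixel coordinates (u1, v1)
uvw = K @ P2
u1, v1 = uvw[:2] / uvw[2]
```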
Preferably, in step S1, the image pickup device is disposed on the roof or around the vehicle.
Preferably, the image pickup device includes a plurality of cameras, and an overlapping area is provided between images acquired by the cameras.
Preferably, in step S1, the actual position of the one or more marker points in the image is identified.
Preferably, in step S1, the positions of the one or more marker points in the image are obtained by means of a trained neural network.
Preferably, in step S4, the value of M [ t|t ] is optimized by the least square method.
Preferably, the initial values of the rotation-translation transformation matrix T and the time delay t are 0.
Preferably, in step S1, the number of the marking points is 10-20.
Preferably, in step S21, the actual GPS position of the marker point may be measured in advance, or may be measured directly using an RTK device at the time of calibration.
The application resides in, but is not limited to, the following aspects:
(1) The traditional calibration method requires binding the camera and the IMU (which carries a GPS module) together very tightly and shaking them vigorously; for a camera and IMU mounted on a vehicle this is impossible, so the traditional method cannot be used, whereas the present method can. This is one of the points of the application.
(2) A dynamic calibration method for the external parameters of a camera is provided; the user can complete the calibration of T and t in one pass simply by driving the vehicle a certain distance, without erecting special marker points. Moreover, compared with the static calibration method, which requires repeatedly repositioning the vehicle and moving the calibration points, the dynamic calibration method takes less time. This is one of the points of the application.
(3) A brand-new loss function is designed; this function can accurately and effectively compute the transformation matrix M[T|t] and the time delay. The loss function takes into account the attitude and position of the IMU and the GPS coordinates of the marker points; it is linked to the transformation matrix through a mathematical model, and the value of M[T|t] is adjusted, using the reprojection error against a specified threshold, to yield the calibration result. This is one of the points of the application.
Drawings
FIG. 1 is a step diagram of a method for dynamically calibrating external parameters of a camera according to the application.
The present application will be described in further detail below. The following examples are merely illustrative of the present application and are not intended to represent or limit the scope of the application as defined in the claims.
Detailed Description
The technical scheme of the application is further described below by the specific embodiments with reference to the accompanying drawings.
For a better illustration of the present application and to facilitate understanding of its technical solution, exemplary but non-limiting examples are given below:
the three-dimensional map has the advantages of good expression effect, small data volume, high transmission speed, low cost and the like, and is widely applied to the fields of social management, public service, propaganda display and the like after years of development.
Generally, three-dimensional modeling staff complete modeling work by stitching images, using a high-resolution satellite map or aerial photograph as the reference map together with data gathered by collection staff; however, aerial or drone photography has numerous disadvantages, such as excessive cost and low resolution. The application provides a brand-new data acquisition mode for three-dimensional maps: a 360-degree, omnidirectional, high-resolution camera arrangement is installed on a vehicle, for example 4 high-resolution cameras at the front, rear, left, and right of the roof. The vehicle traverses each street and crossing, the on-board cameras capture high-resolution images as reference maps, and the images are then stitched to complete the three-dimensional modeling work.
The external parameters of the camera are its parameters in the world coordinate system, such as the camera position and rotation direction; calibrating them amounts to determining the mapping from world coordinates to pixel coordinates. The world coordinate system is taken as given; calibration uses the known world coordinates and pixel coordinates of calibration control points to solve for the mapping relation. Once this relation is solved, the world coordinates of a point can be deduced in reverse from its pixel coordinates.
the purpose of camera calibration is to determine the values of some parameters of the camera by which a worker can map points in a three-dimensional space to image space, or vice versa.
The application relates to a dynamic calibration method in which the vehicle is in motion during calibration. A marker point can be a semantic feature on the road (such as a sign board, a lamp post, etc.); as long as its GPS position is known, it can be used for calibration. The GPS position of a marker point may be measured in advance, obtained directly during calibration, or measured with RTK equipment at calibration time.
Typically, a camera has 6 external parameters. The rotation parameters about the three axes are (ω, δ, θ); the 3×3 rotation matrices of the individual axes are composed (i.e. multiplied together) to obtain a single 3×3 matrix carrying the three-axis rotation information. The translation parameters along the three axes are (Tx, Ty, Tz).
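The composition of the per-axis rotation matrices described above can be sketched as follows; the angle values and the z-y-x multiplication order are illustrative choices, since the text does not fix a convention.

```python
import numpy as np

def rot_x(a):
    """3x3 rotation about the x-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    """3x3 rotation about the y-axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    """3x3 rotation about the z-axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

omega, delta, theta = 0.1, -0.2, 0.3            # made-up rotation parameters
R = rot_z(theta) @ rot_y(delta) @ rot_x(omega)  # combined 3x3 rotation matrix
```

The product R is itself a valid rotation (orthogonal, determinant 1), which is what the combined "three-axis rotation information" matrix must satisfy.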
Compared with the static calibration method, the dynamic calibration method can solve for T and t at one time, which is precisely the goal of dynamic calibration. The static calibration method can only solve for T; if t also needs to be computed, the vehicle has to be driven. Here T is the rotation-translation transformation matrix, comprising rotation and translation, and t is the time delay.
Regarding the time delay t: data from different sensors take different amounts of time to reach the CPU or industrial computer. The CPU timestamps each piece of data on receipt, but that timestamp only indicates when the CPU received the data, not when the data was generated. In general, because image processing is time-consuming, even when image data and IMU data are generated at the same moment, the CPU receives the image data later than the IMU data; that is, the CPU's timestamp on the image data is later than that on the IMU data.
In subsequent mapping or positioning, the IMU, image, GPS, and other sensor data used should be synchronized, i.e. should refer to the same moment. However, since the CPU actually judges whether data are synchronized by their timestamps, it follows from the description of timestamp generation above that, if the data times are not compensated, the "synchronized" sensor data the CPU obtains by reading timestamps are in fact unsynchronized. The compensation time is exactly the t we seek; in the following description it is specifically the time delay of the image data relative to the IMU data. In fact, t may be the time delay of the image data relative to whichever sensor is taken as the reference; here the IMU (with its GPS) serves as the reference.
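The compensation described above can be sketched as a timestamp shift followed by interpolation of the IMU trajectory; the sampling times, speed, and delay below are made-up values for illustration.

```python
import numpy as np

# IMU position samples at their own (accurate) timestamps; the vehicle
# moves along x at 10 m/s in this toy example.
imu_times = np.array([0.00, 0.10, 0.20, 0.30])
imu_x = np.array([0.0, 1.0, 2.0, 3.0])

def imu_x_at(query_time):
    """Linearly interpolate the IMU x-position at an arbitrary time."""
    return np.interp(query_time, imu_times, imu_x)

image_stamp = 0.15   # timestamp the CPU put on an image frame (late)
delay_t = 0.03       # calibrated time delay t of images relative to the IMU

# The frame actually depicts the scene at image_stamp - delay_t, so the
# IMU position to pair with it for truly synchronized data is:
x_sync = imu_x_at(image_stamp - delay_t)
```

Without subtracting delay_t, the frame would be paired with the IMU position at 0.15 s (1.5 m) instead of the correct 0.12 s (1.2 m): a 0.3 m mapping error from a 30 ms delay at this speed.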
Example 1
In this embodiment, a plurality of high-resolution cameras on the top of the vehicle provide 360-degree high-definition video or still capture. With the vehicle in motion, the calibration method for the external parameters of the camera includes the following steps:
Step S1: one or more marker points are set; the vehicle is driven, preferably slowly, while the cameras on the top of the vehicle acquire images containing the marker points; the positions (u0, v0) of the marker points in the images are identified, and the GPS position of each marker point is obtained. A marker point can be a semantic feature on the road (a sign, a lamp post, etc.) and needs no special setup; as long as its GPS position is known, it can be used for calibration.
Step S2: the one or more marker points are projected into the image through the transformation matrix M[T|t], yielding the theoretical projection coordinates (u1, v1) of each point on the image.
Step S3: a loss function associated with T and t is constructed, and an error is calculated, which is associated with M[T|t]; the loss function is as follows:
K: the internal reference (intrinsic) matrix of the camera;
T_b: the attitude and position of the IMU;
V_b: the three-axis velocity of the IMU;
the GPS coordinates of the i-th marker point;
the position of the i-th marker point in the image; this position is also an observed value and may be identified manually;
T: the rotation-translation transformation matrix, comprising rotation and translation;
t: the time delay.
where the coordinates (u1, v1) of the i-th marker point are the estimated values.
Step S4: the value of M[T|t] is optimized by the least-squares method, and steps S1-S3 are repeated until the value of the loss function falls below a specified threshold; the value of M[T|t] is then taken as the calibration result. Optimizing M[T|t] by least squares is prior art, and there are many ways to solve for its value, for example differentiating the reprojection-error function and judging by the magnitude of the derivative. For simplicity, the initial value of the rotation-translation transformation matrix T can be taken as 0; its true magnitude is on the order of 10^-6. The initial value of the time delay is likewise taken as zero.
Example 2
In this embodiment, a plurality of high-resolution cameras may be disposed around the vehicle, so long as 360-degree high-definition video or still capture is achieved; the calibration method for the external parameters of the cameras includes the following steps:
Step S1: the vehicle runs slowly; cameras are arranged around the vehicle; multiple frames of images are acquired; and the positions (u0, v0) of one or more marker points in the images are identified. The position coordinates of the marker points in an image can be computed by a trained neural network, trained on a large amount of existing data (namely, images with known marker-point coordinates).
Step S2: the actual GPS position of each marker point is converted into an IMU coordinate system with the IMU as the origin of coordinates, obtaining a point P1. Here, the GPS position and the IMU coordinates are both three-dimensional, and the resulting point P1 is also a three-dimensional coordinate.
Step S3: P1 is converted through the transformation matrix M[T|t] into a camera coordinate system with the camera as the coordinate origin, obtaining a point P2. The point P2 here is likewise a three-dimensional coordinate.
Step S4: P2 is converted into the image coordinate system through the projection matrix of the camera, obtaining the projection coordinates (u1, v1) of the point on the image. The projection matrix here is the intrinsic matrix of the camera; in effect, the three-dimensional point P2 is converted via the intrinsic matrix into the theoretical two-dimensional coordinates (u1, v1) of the point on the camera image.
Step S5: a loss function associated with T and t is constructed, and an error is calculated, which is associated with M[T|t]; the loss function is as follows:
K: the internal reference (intrinsic) matrix of the camera;
T_b: the attitude and position of the IMU;
V_b: the three-axis velocity of the IMU;
the GPS coordinates of the i-th marker point;
the position of the i-th marker point in the image; this position is also an observed value and may be identified manually;
T: the rotation-translation transformation matrix, comprising rotation and translation;
t: the time delay.
where the coordinates (u1, v1) of the i-th marker point are the estimated values.
Step S6: the value of M[T|t] is optimized by the least-squares method, and steps S1-S5 are repeated, iterating the value of M[T|t] until the value of the loss function falls below a specified threshold; the value of M[T|t] is then taken as the calibration result. Optimizing M[T|t] by least squares is prior art, and there are many ways to solve for its value, for example differentiating the reprojection-error function and judging by the magnitude of the derivative. For simplicity, the initial value of the rotation-translation transformation matrix T can be taken as 0; its true magnitude is on the order of 10^-6. The initial value of the time delay is likewise taken as zero.
The applicant states that the detailed features of the present application are described through the above embodiments, but the application is not limited to them; that is, it does not mean that the application must rely on the above detailed features to be implemented. It should be apparent to those skilled in the art that any modification of the present application, equivalent substitution of selected components, addition of auxiliary components, selection of specific modes, and the like all fall within the scope of protection and disclosure of the application.
The preferred embodiments of the present application have been described in detail above, but the present application is not limited to the specific details of the above embodiments, and various simple modifications can be made to the technical solution of the present application within the scope of the technical concept of the present application, and all the simple modifications belong to the protection scope of the present application.
In addition, the specific features described in the above embodiments may be combined in any suitable manner, and in order to avoid unnecessary repetition, various possible combinations are not described further.
Moreover, any combination of the various embodiments of the application can be made without departing from the spirit of the application, which should also be considered as disclosed herein.

Claims (9)

1. A dynamic calibration method for external parameters of a camera is characterized in that: the method comprises the following steps:
step S1: setting one or more marking points in the running process of the vehicle, and acquiring the corresponding GPS positions;
step S2: projecting the one or more marking points into an image acquired by the vehicle camera device through a transformation matrix M[T|t] to obtain estimated values of the projection coordinates of the one or more marking points on the image;
step S3: constructing a loss function associated with T and t, and calculating an error, which is associated with M[T|t], as follows:
wherein, K: the internal reference matrix of the camera;
T_b: the attitude and position of the inertial measurement unit IMU;
V_b: the three-axis velocity of the IMU;
the GPS coordinates of the i-th marking point;
the position of the i-th marking point in the image;
T: a rotation-translation transformation matrix, comprising rotation and translation;
t: the time delay;
step S4: optimizing the value of M[T|t] by the least-squares method, and repeating steps S1-S3 until the value of the loss function is lower than a specified threshold, the value of M[T|t] then being taken as the calibration result.
2. The method according to claim 1, characterized in that: the step S2 specifically includes the following steps:
step S21: converting the actual GPS positions of the one or more marking points into an IMU coordinate system with the IMU as the origin of coordinates to obtain a point P1;
step S22: converting P1 into a coordinate system with the image pickup device as the coordinate origin through the transformation matrix M[T|t] to obtain a point P2;
step S23: converting P2 into the image coordinate system acquired by the vehicle camera through the projection matrix of the camera, obtaining theoretical projection coordinates of the point on the image.
3. The method according to claim 1 or 2, characterized in that: in step S1, the imaging device is disposed on the roof or around the vehicle.
4. A method according to claim 3, characterized in that: the image pickup device comprises a plurality of cameras, and an overlapping area is formed between images acquired by the cameras.
5. The method according to claim 1, characterized in that: in step S1, the actual position of the one or more marker points in the image is identified.
6. The method according to claim 1, characterized in that: in step S1, the positions of the one or more marker points in the image are obtained by means of a trained neural network.
7. The method according to claim 1, characterized in that: the initial values of the rotation-translation transformation matrix T and the time delay t are 0.
8. The method according to claim 1, characterized in that: in step S1, the number of the marked points is 10-20.
9. The method according to claim 1, characterized in that: in step S21, the actual GPS position of the marker point is measured in advance, or is measured directly by using the RTK apparatus during calibration.
CN201811154031.7A 2018-09-30 2018-09-30 Dynamic calibration method for external parameters of camera Active CN110969664B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811154031.7A CN110969664B (en) 2018-09-30 2018-09-30 Dynamic calibration method for external parameters of camera


Publications (2)

Publication Number Publication Date
CN110969664A CN110969664A (en) 2020-04-07
CN110969664B true CN110969664B (en) 2023-10-24

Family

ID=70028639

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811154031.7A Active CN110969664B (en) 2018-09-30 2018-09-30 Dynamic calibration method for external parameters of camera

Country Status (1)

Country Link
CN (1) CN110969664B (en)

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101271639B1 (en) * 2011-12-13 2013-06-17 (주)팜비젼 A extrinsic parameter calibration method and system for camera on mobile device
CN103593836A (en) * 2012-08-14 2014-02-19 无锡维森智能传感技术有限公司 A Camera parameter calculating method and a method for determining vehicle body posture with cameras
CN103985118A (en) * 2014-04-28 2014-08-13 无锡观智视觉科技有限公司 Parameter calibration method for cameras of vehicle-mounted all-round view system
EP2808645A1 (en) * 2012-01-23 2014-12-03 Nec Corporation Camera calibration device, camera calibration method, and camera calibration program
JP2015106785A (en) * 2013-11-29 2015-06-08 クラリオン株式会社 Camera calibration device
CN104833372A (en) * 2015-04-13 2015-08-12 武汉海达数云技术有限公司 External parameter calibration method of high-definition panoramic camera of mobile measuring system
CN105844624A (en) * 2016-03-18 2016-08-10 上海欧菲智能车联科技有限公司 Dynamic calibration system, and combined optimization method and combined optimization device in dynamic calibration system
DE102014117888A1 (en) * 2014-12-04 2016-10-13 Connaught Electronics Ltd. Online calibration of a motor vehicle camera system
CN106803273A (en) * 2017-01-17 2017-06-06 湖南优象科技有限公司 A kind of panoramic camera scaling method
WO2017161608A1 (en) * 2016-03-21 2017-09-28 完美幻境(北京)科技有限公司 Geometric calibration processing method and device for camera
WO2017195801A1 (en) * 2016-05-13 2017-11-16 オリンパス株式会社 Calibration device, calibration method, optical device, imaging device, projection device, measurement system and measurement method
EP3246875A2 (en) * 2016-05-18 2017-11-22 Siemens Healthcare GmbH Method and system for image registration using an intelligent artificial agent
FR3053554A1 (en) * 2016-06-30 2018-01-05 Continental Automotive France METHOD FOR RAPID CALIBRATION OF A MOTOR VEHICLE CAMERA
CN108090456A (en) * 2017-12-27 2018-05-29 北京初速度科技有限公司 A kind of Lane detection method and device
CN108171748A (en) * 2018-01-23 2018-06-15 哈工大机器人(合肥)国际创新研究院 A kind of visual identity of object manipulator intelligent grabbing application and localization method
CN108182704A (en) * 2016-12-08 2018-06-19 广州映博智能科技有限公司 Localization method based on Shape context feature
CN108345875A (en) * 2018-04-08 2018-07-31 北京初速度科技有限公司 Wheeled region detection model training method, detection method and device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4751939B2 (en) * 2009-03-31 2011-08-17 アイシン精機株式会社 Car camera calibration system
US9386302B2 (en) * 2014-05-21 2016-07-05 GM Global Technology Operations LLC Automatic calibration of extrinsic and intrinsic camera parameters for surround-view camera system
EP2960859B1 (en) * 2014-06-19 2019-05-01 Tata Consultancy Services Limited Constructing a 3d structure
US10455138B2 (en) * 2015-04-20 2019-10-22 Ian Schillebeeckx Camera calibration with lenticular arrays
GB2541884A (en) * 2015-08-28 2017-03-08 Imp College Of Science Tech And Medicine Mapping a space using a multi-directional camera


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A method for calibrating the internal and external parameters of a digital camera based on multiple vanishing points; Wang Dehui, Fan Qingwen, Yuan Zhongfan; Journal of Sichuan University (Engineering Science Edition), No. 05, pp. 193-196 *
Research on an extrinsic-parameter calibration algorithm for long-range cameras with a large field of view; Quan Houde, Wang Jianhua, Zhao Bo; Electronics Optics & Control, 2009, No. 11, full text *
A discussion of a camera calibration method; Wang Zhuojun, Yang Qunfeng, Fan Kai, Sha Xinmei; Mechanical & Electrical Engineering Technology, 2015, Vol. 44, No. 12, full text *
Research on simulation and verification techniques for photogrammetry systems; Yan Long, Hua Zhen, Chen Chengjun, Du Huiqiu; Journal of System Simulation, 2013, No. 06, full text *

Also Published As

Publication number Publication date
CN110969664A (en) 2020-04-07

Similar Documents

Publication Publication Date Title
CN110969663B (en) Static calibration method for external parameters of camera
US11802769B2 (en) Lane line positioning method and apparatus, and storage medium thereof
US10509983B2 (en) Operating device, operating system, operating method, and program therefor
CN109887057B (en) Method and device for generating high-precision map
CN111750853B (en) Map establishing method, device and storage medium
WO2014034064A1 (en) Image processing device and storage medium
CN109544630B (en) Pose information determination method and device and visual point cloud construction method and device
US20080050042A1 (en) Hardware-in-the-loop simulation system and method for computer vision
CN109374008A (en) A kind of image capturing system and method based on three mesh cameras
CN109690622A (en) Camera registration in multicamera system
CA2526105A1 (en) Image display method and image display apparatus
CN112577517A (en) Multi-element positioning sensor combined calibration method and system
CN113870343A (en) Relative pose calibration method and device, computer equipment and storage medium
CN111415387A (en) Camera pose determining method and device, electronic equipment and storage medium
CN110260857A (en) Calibration method, device and the storage medium of vision map
CN108603933A (en) The system and method exported for merging the sensor with different resolution
CN113587934B (en) Robot, indoor positioning method and device and readable storage medium
CN109978954A (en) The method and apparatus of radar and camera combined calibrating based on cabinet
CN103411587A (en) Positioning and attitude-determining method and system
Zhao et al. Direct georeferencing of oblique and vertical imagery in different coordinate systems
CN114777768A (en) High-precision positioning method and system for satellite rejection environment and electronic equipment
CN113296133A (en) Device and method for realizing position calibration based on binocular vision measurement and high-precision positioning fusion technology
JP5214355B2 (en) Vehicle traveling locus observation system, vehicle traveling locus observation method, and program thereof
CN111383282A (en) Pose information determination method and device
CN110969664B (en) Dynamic calibration method for external parameters of camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220301

Address after: 100083 unit 501, block AB, Dongsheng building, No. 8, Zhongguancun East Road, Haidian District, Beijing

Applicant after: BEIJING MOMENTA TECHNOLOGY Co.,Ltd.

Address before: Room 28, 4 / F, block a, Dongsheng building, No. 8, Zhongguancun East Road, Haidian District, Beijing 100089

Applicant before: BEIJING CHUSUDU TECHNOLOGY Co.,Ltd.

GR01 Patent grant