CN110969664A - Dynamic calibration method for external parameters of camera - Google Patents


Info

Publication number
CN110969664A
CN110969664A CN201811154031.7A CN201811154031A
Authority
CN
China
Prior art keywords
image
camera
points
vehicle
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811154031.7A
Other languages
Chinese (zh)
Other versions
CN110969664B (en)
Inventor
李晓东
陈亮
芦超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Momenta Technology Co Ltd
Original Assignee
Beijing Chusudu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Chusudu Technology Co ltd filed Critical Beijing Chusudu Technology Co ltd
Priority to CN201811154031.7A priority Critical patent/CN110969664B/en
Publication of CN110969664A publication Critical patent/CN110969664A/en
Application granted granted Critical
Publication of CN110969664B publication Critical patent/CN110969664B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

A dynamic calibration method for the external parameters of a camera, comprising: step S1: setting one or more marker points and acquiring their corresponding GPS (Global Positioning System) positions while the vehicle drives slowly; step S2: projecting the one or more marker points onto the image acquired by the vehicle camera device through a transformation matrix M[T|t] to obtain the theoretical projection coordinates of the marker points on the image; step S3: constructing a loss function in T and t and calculating the error. By constructing a loss function relating the observed coordinates of a marker point in the camera image to the coordinate estimate obtained through a series of transformations, the rotation-translation transformation matrix T and the time delay t can be calibrated in one pass.

Description

Dynamic calibration method for external parameters of camera
Technical Field
The invention relates to the field of intelligent driving, in particular to a dynamic calibration method for external parameters of a camera.
Background
Currently, the accuracy of an autonomous-driving map is measured in the GPS coordinate system; that is, each point on the map must be represented by a GPS coordinate. In a multi-sensor setup, the camera and the GPS are generally not mounted at the same location on the vehicle and may be separated by a distance of 2-3 meters. The external parameters of the camera therefore need to be calibrated to establish the spatial position relationship between the camera and the GPS module. If the external parameters are not calibrated and the map is built directly from the camera images and the vehicle position, an error of two to three meters can result.
In the conventional calibration method, the camera and the IMU (which carries the GPS module) must be bound tightly together and then shaken vigorously to excite all axes of the IMU, so that the trajectory of the IMU and the trajectory of the camera can be aligned and the calibration completed. A camera and an IMU mounted on a vehicle cannot be shaken in this way, so the conventional calibration method is not applicable.
In the static calibration method for the external parameters, at least three marker points must be erected around the vehicle for each measurement, and data from multiple measurements must be collected to improve the calibration accuracy. The vehicle and the marker points therefore have to be moved repeatedly during one calibration, so completing a single external-parameter calibration with the static method is time-consuming.
Disclosure of Invention
In view of the problems existing in the prior art, the invention provides a dynamic calibration method for external parameters of a camera, which is characterized in that: the method comprises the following steps:
step S1: setting one or more marker points and acquiring their corresponding GPS (Global Positioning System) positions while the vehicle is driving;
step S2: projecting the one or more marker points onto the image acquired by the vehicle camera device through a transformation matrix M[T|t] to obtain theoretical projection coordinates of the one or more marker points on the image;
step S3: constructing a loss function in T and t and calculating an error associated with M[T|t]; the loss function is as follows:
loss(T, t) = Σ_i ‖ p_i − p̂_i(T, t) ‖²  (sum of squared reprojection errors over the marker points; the original equation appears only as an image)
K: the internal reference (intrinsic) matrix of the camera;
Tb: the attitude and position of the IMU;
Vb: the velocity of the IMU along its three axes;
P_i^g: the GPS coordinate of the ith marker point;
p_i: the position of the ith marker point in the image;
T is the rotation-translation transformation matrix, including rotation and translation;
t is the time delay;
step S4: optimizing the value of M[T|t] by the least squares method, and repeating steps S1-S3 until the value of the loss function falls below a specified threshold; the value of M[T|t] at that point is taken as the calibration result.
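Steps S1-S4 amount to a nonlinear least-squares problem over the seven unknowns of M[T|t] (three rotation, three translation, one time delay). Below is a minimal sketch under assumed values: the intrinsic matrix K, the synthetic marker coordinates, and the simplified velocity-based delay compensation are illustrative stand-ins, not details taken from the patent.

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative intrinsic matrix K (an assumption, not a value from the patent).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def rotation(rvec):
    # Rodrigues formula: axis-angle vector -> 3x3 rotation matrix.
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    Kx = np.array([[0.0, -k[2], k[1]],
                   [k[2], 0.0, -k[0]],
                   [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * Kx + (1.0 - np.cos(theta)) * (Kx @ Kx)

def project(params, pts_imu, vel_imu):
    # Apply M[T|t]: compensate each marker for the time delay t using the
    # IMU velocity (a simplified stand-in for the patent's Vb term), move it
    # into the camera frame, then project with K.
    rvec, trans, t_delay = params[:3], params[3:6], params[6]
    R = rotation(rvec)
    cam = (pts_imu - vel_imu * t_delay) @ R.T + trans
    uvw = cam @ K.T
    return uvw[:, :2] / uvw[:, 2:3]  # perspective division -> pixel coordinates

def residuals(params, pts_imu, obs_px, vel_imu):
    # loss(T, t): stacked reprojection errors (estimated minus observed pixels).
    return (project(params, pts_imu, vel_imu) - obs_px).ravel()

# Synthetic markers in front of the camera and per-sample IMU velocities.
rng = np.random.default_rng(0)
pts_imu = rng.uniform([-2.0, -1.0, 8.0], [2.0, 1.0, 20.0], (12, 3))
vel_imu = rng.uniform([-5.0, -1.0, -1.0], [5.0, 1.0, 1.0], (12, 3))
true = np.array([0.01, -0.02, 0.005, 0.3, -0.1, 0.05, 0.04])
obs_px = project(true, pts_imu, vel_imu)

x0 = np.zeros(7)  # initial T (axis-angle + translation) and t are 0, as suggested
sol = least_squares(residuals, x0, args=(pts_imu, obs_px, vel_imu))
```

With noiseless synthetic observations the optimizer drives the loss essentially to zero and recovers the generating parameters; in practice the residuals would come from detected marker positions and the IMU pose Tb at each frame.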
Preferably, the step S2 specifically includes the following steps:
step S21: converting the actual GPS positions of the one or more marked points to an IMU coordinate system with the IMU as a coordinate origin to obtain a point P1;
step S22: converting P1 into a coordinate system with the camera as the coordinate origin through the transformation matrix M[T|t] to obtain a point P2;
step S23: and converting the P2 into an image coordinate system acquired by the vehicle camera device through the projection matrix of the camera device, and obtaining the theoretical projection coordinate of the point on the image.
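The chain S21 → S22 → S23 can be sketched as three coordinate-frame changes. In the toy example below, the marker position in a local frame, the identity world-to-IMU rotation, the 1.5 m camera offset, and the intrinsic matrix K are hypothetical values chosen only to make the example concrete.

```python
import numpy as np

# Illustrative intrinsic matrix (an assumption, not from the patent).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def gps_to_imu(p_world, imu_pos, R_world_imu):
    # Step S21: express the marker in a frame with the IMU at the origin -> P1.
    return R_world_imu @ (p_world - imu_pos)

def imu_to_camera(P1, R_ic, t_ic):
    # Step S22: apply the rotation-translation part of M[T|t] -> P2.
    return R_ic @ P1 + t_ic

def camera_to_pixel(P2, K):
    # Step S23: project with the intrinsic matrix, divide by depth -> (u1, v1).
    uvw = K @ P2
    return uvw[:2] / uvw[2]

# Toy numbers: IMU at the origin with axes aligned to the world frame,
# camera offset 1.5 m along the optical axis.
marker = np.array([2.0, 1.0, 10.0])  # marker position in a local frame (assumed)
P1 = gps_to_imu(marker, np.zeros(3), np.eye(3))
P2 = imu_to_camera(P1, np.eye(3), np.array([0.0, 0.0, 1.5]))
u1, v1 = camera_to_pixel(P2, K)
```

Each step keeps the point three-dimensional until the final perspective division, matching the description that P1 and P2 are 3D coordinates and only (u1, v1) is 2D.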
Preferably, in step S1, the image pickup devices are disposed on the roof or around the vehicle.
Preferably, the camera device comprises a plurality of cameras, and images acquired by the cameras have an overlapping area.
Preferably, in step S1, the actual position of the one or more marker points in the image is identified.
Preferably, in step S1, the position of the one or more marker points in the image is obtained by a trained neural network.
Preferably, in step S4, the value of M[T|t] is optimized by the least squares method.
Preferably, the initial values of the rotation-translation transformation matrix T and the time delay t are 0.
Preferably, in step S1, the number of the marked points is 10-20.
Preferably, in step S21, the actual GPS position of the marker may be measured in advance, or may be measured directly by using an RTK device during calibration.
The invention is characterized by, but not limited to, the following aspects:
(1) The traditional method requires binding the camera and the IMU (which carries the GPS module) tightly together and then shaking them violently; a camera and IMU mounted on a vehicle cannot be shaken in this way, so the traditional method cannot be used there. This is one of the points of the present invention.
(2) A dynamic calibration method for the external parameters of a camera is provided; the user can complete the calibration of T and t in one pass simply by driving the vehicle for a certain distance, without erecting special marker points. Moreover, compared with the static calibration method, which requires repeatedly repositioning the vehicle and moving the calibration points, the dynamic calibration method takes much less time. This is one of the points of the present invention.
(3) A brand-new loss function is designed, through which the transformation matrix M[T|t] and the time delay can be solved accurately and effectively. The loss function takes into account the attitude and position of the IMU and the GPS coordinates of the marker points; it ties the transformation matrix into the mathematical model, and the value of M[T|t] is adjusted until the reprojection error falls below a specified threshold, yielding the calibration result. This is one of the points of the present invention.
Drawings
FIG. 1 is a step diagram of a dynamic calibration method for external parameters of a camera according to the present invention.
The present invention is described in further detail below. The following examples are merely illustrative and do not limit the scope of protection of the present invention, which is defined by the claims.
Detailed Description
The technical scheme of the invention is further explained by the specific implementation mode in combination with the attached drawings.
To better illustrate the invention and to facilitate the understanding of the technical solutions thereof, typical but non-limiting examples of the invention are as follows:
The three-dimensional map has good expressive effect and features such as small data volume, fast transmission, and low cost; after years of development it has been widely applied in fields such as social administration, public services, and publicity and display.
Generally, three-dimensional map modelers complete the modeling work by stitching images collected in the field, using a high-resolution satellite map or aerial photograph as the reference map; however, aerial or unmanned-aerial-vehicle photography has many disadvantages, such as high cost and low resolution. The present application provides a brand-new data acquisition mode for three-dimensional maps: a 360-degree, omnidirectional, high-resolution camera device is installed on the vehicle, either on the roof (for example, four high-resolution cameras facing the four sides of the vehicle) or distributed around the vehicle. The vehicle traverses every street and intersection while the camera device captures high-resolution images as the reference map, and the images are then stitched to complete the three-dimensional modeling work.
The extrinsic parameters of a camera are its parameters in the world coordinate system, such as the camera position and rotation direction. Extrinsic calibration solves for the mapping from world coordinates to pixel coordinates: the world coordinate system is taken as given, the world and pixel coordinates of calibration control points are known, and the mapping relationship is solved; once this relationship is known, the world coordinates of a point can be deduced back from its pixel coordinates.
The purpose of camera calibration is thus to determine the parameter values of the camera, by which points in three-dimensional space can be mapped to image space, and vice versa.
The present invention is a dynamic calibration method: the vehicle is in motion during calibration, and the marker can be a semantic feature on the road (a sign board, a lamp post, etc.) that requires no special setup; as long as its GPS position is known, it can be used for calibration. The GPS position of the marker may be measured in advance, acquired directly during calibration, or measured directly with RTK equipment during calibration.
Generally, a camera has 6 external parameters: the rotation parameters about the three axes, (ω, δ, θ), whose per-axis 3×3 rotation matrices are multiplied together to obtain the combined three-axis rotation matrix, also of size 3×3; and the translation parameters along the three axes, (Tx, Ty, Tz).
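As a sketch of these six external parameters, the three per-axis rotations can be composed into a single 3×3 matrix as follows; the Z·Y·X multiplication order and the angle values are assumptions for illustration, since the text does not fix an order.

```python
import numpy as np

def rot_x(a):
    # Rotation about the X axis by angle a (radians).
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    # Rotation about the Y axis.
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    # Rotation about the Z axis.
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

omega, delta, theta = 0.02, -0.01, 0.03        # illustrative small angles
R = rot_z(theta) @ rot_y(delta) @ rot_x(omega)  # combined 3x3 rotation matrix
t_vec = np.array([0.5, 0.0, 1.2])               # translation (Tx, Ty, Tz), assumed
M = np.hstack([R, t_vec[:, None]])              # 3x4 extrinsic matrix [R | t]
```

The product of the three per-axis matrices is again a proper rotation (orthonormal, determinant 1), which is why the combined three-axis rotation is still 3×3.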
Compared with the static calibration method, the dynamic calibration method can solve for T and t in one pass, which is precisely the goal of dynamic calibration. The static calibration method can only compute T; computing t requires the vehicle to be moving. Here T is the rotation-translation transformation matrix, including rotation and translation, and t is the time delay.
With respect to the time delay t: data from different sensors reach the CPU or industrial computer at different times. The CPU stamps each datum when it receives it, but this time stamp only records when the CPU received the data, not when the data was actually generated. Because image processing is time-consuming, the CPU generally receives image data later than IMU data generated at the same moment, i.e. the time stamp given to the image data is later than that given to the IMU data.
In subsequent mapping or positioning, sensor data such as IMU, image, and GPS data should be synchronized, i.e. should have been generated at the same moment. However, the CPU judges whether data are synchronized by their time stamps, so, given how the time stamps are generated, the "synchronized" sensor data the CPU obtains by reading time stamps are not actually synchronized unless the data times are compensated. The compensation time is exactly the required t, which in the following description is the time delay of the image data relative to the IMU data. In principle, t could be the delay of the image data relative to whichever sensor is taken as the reference.
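The compensation described above can be illustrated with toy timestamps; the 40 ms delay and the sample times below are made-up numbers, not values from the patent.

```python
# The CPU stamps an image on arrival, so subtracting a calibrated delay t
# recovers the approximate capture time, which can then be matched against
# the IMU samples that were actually simultaneous.

t_delay = 0.040  # calibrated image delay in seconds (illustrative)

imu_stamps = [10.000, 10.010, 10.020, 10.030, 10.040, 10.050]
image_arrival = 10.062                   # CPU receive time of one frame
image_capture = image_arrival - t_delay  # compensated capture time

# Pick the IMU sample closest to the compensated capture time.
nearest = min(imu_stamps, key=lambda s: abs(s - image_capture))
```

Without the compensation, the frame would be paired with the IMU sample nearest its arrival time rather than its capture time, which is exactly the desynchronization the calibrated t removes.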
Example 1
In this embodiment, a plurality of high-resolution cameras on the roof of the vehicle provide 360-degree high-definition video or still capture, and the calibration method for the external parameters of the cameras, carried out while the vehicle is driven, comprises the following steps:
step S1: Set one or more marker points. Drive the vehicle, preferably slowly, while the camera on top of the vehicle acquires images containing the marker points; identify the position (u0, v0) of each marker point in the image and obtain the GPS position of each marker point. A marker point can be a semantic feature on the road (a sign, a light pole, etc.) and needs no special setup; as long as its GPS position is known, it can be used for calibration.
Step S2: Project the one or more marker points into the image through the transformation matrix M[T|t] to obtain the theoretical projection coordinates (u1, v1) of each point on the image.
Step S3: Construct a loss function in T and t and calculate the error, which is associated with M[T|t]; the loss function is as follows:
loss(T, t) = Σ_i ‖ p_i − p̂_i(T, t) ‖²  (sum of squared reprojection errors over the marker points; the original equation appears only as an image)
K: the internal reference (intrinsic) matrix of the camera;
Tb: the attitude and position of the IMU;
Vb: the velocity of the IMU along its three axes;
P_i^g: the GPS coordinate of the ith marker point;
p_i: the position of the ith marker point in the image; this position is an observation and may be identified manually.
T is a rotational-translational transformation matrix, including rotation and translation;
t is the time delay.
where p̂_i denotes the estimated coordinates (u1, v1) of the ith marker point.
Step S4: Optimize the value of M[T|t] by the least squares method and repeat steps S1-S3 until the value of the loss function falls below a specified threshold; the value of M[T|t] at that point is taken as the calibration result. Optimizing M[T|t] by least squares is prior art and can be carried out in various ways; for example, the update of M[T|t] can be determined from the derivative of the reprojection error function and its magnitude. For simplicity the initial value of the rotation-translation transformation matrix T can be set to 0 (its true magnitude is on the order of 10^-6), and the initial value of the time delay is likewise zero.
Example 2
In this embodiment, a plurality of high-resolution video cameras or still cameras may be arranged around the vehicle, as long as 360-degree high-definition video or photo capture is achieved, and the calibration method for the external parameters of the cameras comprises the following steps:
step S1: Drive the vehicle slowly with video cameras or still cameras arranged around it, acquire multiple frames of images, and identify the position (u0, v0) of one or more marker points in the images. The position coordinates of the marker points in the image can be computed by a trained neural network, trained on a large amount of existing data (i.e., images paired with the known coordinates of the marker points in them).
Step S2: the actual GPS position of the marked point is converted to an IMU coordinate system with the IMU as the origin of coordinates, resulting in point P1. Here, the GPS position and IMU coordinates are both three-dimensional coordinates, and the resulting point P1 is also a three-dimensional coordinate.
Step S3: converting P1 to a camera coordinate system with the camera as the origin of coordinates by transforming the matrix M [ T | T ] to obtain point P2. Here also point P2 is a three-dimensional coordinate.
Step S4: the projection coordinates of the point on the image are obtained by converting P2 into the image coordinate system through the projection matrix of the camera (u1, v 1). The projection matrix is the camera's internal reference matrix, and this step is actually converting the three-dimensional point coordinate P2 into the two-dimensional coordinates (u1, v1) of the point theoretically on the camera image by the camera's internal reference matrix.
Step S5: constructing a loss function associated with T, T, calculating an error, the error being associated with M [ T | T ]; the loss error function is as follows:
loss(T, t) = Σ_i ‖ p_i − p̂_i(T, t) ‖²  (sum of squared reprojection errors over the marker points; the original equation appears only as an image)
K: the internal reference (intrinsic) matrix of the camera;
Tb: the attitude and position of the IMU;
Vb: the velocity of the IMU along its three axes;
P_i^g: the GPS coordinate of the ith marker point;
p_i: the position of the ith marker point in the image; this position is an observation and may be identified manually.
T is a rotational-translational transformation matrix, including rotation and translation;
t is the time delay.
where p̂_i denotes the estimated coordinates (u1, v1) of the ith marker point.
Step S6: Optimize the value of M[T|t] by the least squares method and repeat steps S1-S5, iterating the value of M[T|t] until the loss function falls below a specified threshold; the value of M[T|t] at that point is taken as the calibration result. Optimizing M[T|t] by least squares is prior art and can be carried out in various ways; for example, the update of M[T|t] can be determined from the derivative of the reprojection error function and its magnitude. For simplicity the initial value of the rotation-translation transformation matrix T can be set to 0 (its true magnitude is on the order of 10^-6), and the initial value of the time delay is likewise zero.
The applicant declares that the present invention illustrates the detailed structural features of the present invention through the above embodiments, but the present invention is not limited to the above detailed structural features, that is, it does not mean that the present invention must be implemented depending on the above detailed structural features. It should be understood by those skilled in the art that any modifications of the present invention, equivalent substitutions of selected components of the present invention, additions of auxiliary components, selection of specific modes, etc., are within the scope and disclosure of the present invention.
The preferred embodiments of the present invention have been described in detail, however, the present invention is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present invention within the technical idea of the present invention, and these simple modifications are within the protective scope of the present invention.
It should be noted that the various technical features described in the above embodiments can be combined in any suitable manner as long as there is no contradiction; to avoid unnecessary repetition, the present invention does not separately describe every possible combination.
In addition, any combination of the various embodiments of the present invention is also possible, and the same should be considered as the disclosure of the present invention as long as it does not depart from the spirit of the present invention.

Claims (10)

1. A dynamic calibration method for external parameters of a camera is characterized in that: the method comprises the following steps:
step S1: setting one or more marker points and acquiring their corresponding GPS (Global Positioning System) positions while the vehicle is driving;
step S2: projecting the one or more marker points onto the image acquired by the vehicle camera device through a transformation matrix M[T|t] to obtain an estimated value of the projection coordinates of the one or more marker points on the image;
step S3: constructing a loss function in T and t and calculating an error associated with M[T|t]; the loss function is as follows:
loss(T, t) = Σ_i ‖ p_i − p̂_i(T, t) ‖²  (sum of squared reprojection errors over the marker points; the original equation appears only as an image)
wherein, K: the internal reference (intrinsic) matrix of the camera;
Tb: the attitude and position of the IMU;
Vb: the velocity of the IMU along its three axes;
P_i^g: the GPS coordinate of the ith marker point;
p_i: the position of the ith marker point in the image;
T is the rotation-translation transformation matrix, including rotation and translation;
t is the time delay;
step S4: optimizing the value of M[T|t] by the least squares method, and repeating steps S1-S3 until the value of the loss function is lower than a specified threshold, wherein the value of M[T|t] is taken as the calibration result.
2. The method of claim 1, wherein: the step S2 specifically includes the following steps:
step S21: converting the actual GPS positions of the one or more marked points to an IMU coordinate system with the IMU as a coordinate origin to obtain a point P1;
step S22: converting P1 into a coordinate system with the camera as the coordinate origin through the transformation matrix M[T|t] to obtain a point P2;
step S23: and converting the P2 into an image coordinate system acquired by the vehicle camera device through the projection matrix of the camera device, and obtaining the theoretical projection coordinate of the point on the image.
3. The method according to claim 1 or 2, characterized in that: in step S1, the camera devices are disposed on the roof of the vehicle or around the vehicle.
4. The method of claim 3, wherein: the camera device comprises a plurality of cameras, and the images acquired by the cameras have an overlapping area.
5. The method according to any one of claims 1 to 4, characterized in that: in step S1, the actual position of the one or more marker points in the image is identified.
6. The method according to any one of claims 1 to 5, characterized in that: in step S1, the position of the one or more marker points in the image is obtained through a trained neural network.
7. The method according to any one of claims 1 to 6, characterized in that: in step S4, the value of M[T|t] is optimized by the least squares method.
8. The method according to any one of claims 1 to 7, characterized in that: the initial values of the rotation-translation transformation matrix T and the time delay t are 0.
9. The method according to any one of claims 1 to 8, characterized in that: in step S1, the number of marker points is 10-20.
10. The method according to any one of claims 1 to 9, characterized in that: in step S21, the actual GPS position of the marker point may be measured in advance, or measured directly using an RTK device during calibration.
CN201811154031.7A 2018-09-30 2018-09-30 Dynamic calibration method for external parameters of camera Active CN110969664B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811154031.7A CN110969664B (en) 2018-09-30 2018-09-30 Dynamic calibration method for external parameters of camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811154031.7A CN110969664B (en) 2018-09-30 2018-09-30 Dynamic calibration method for external parameters of camera

Publications (2)

Publication Number Publication Date
CN110969664A true CN110969664A (en) 2020-04-07
CN110969664B CN110969664B (en) 2023-10-24

Family

ID=70028639

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811154031.7A Active CN110969664B (en) 2018-09-30 2018-09-30 Dynamic calibration method for external parameters of camera

Country Status (1)

Country Link
CN (1) CN110969664B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114708333A (en) * 2022-03-08 2022-07-05 智道网联科技(北京)有限公司 Method and device for generating external reference model of automatic calibration camera
CN114708333B (en) * 2022-03-08 2024-05-31 智道网联科技(北京)有限公司 Method and device for generating automatic calibration camera external parameter model

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100245576A1 (en) * 2009-03-31 2010-09-30 Aisin Seiki Kabushiki Kaisha Calibrating apparatus for on-board camera of vehicle
KR101271639B1 (en) * 2011-12-13 2013-06-17 (주)팜비젼 A extrinsic parameter calibration method and system for camera on mobile device
CN103593836A (en) * 2012-08-14 2014-02-19 无锡维森智能传感技术有限公司 A Camera parameter calculating method and a method for determining vehicle body posture with cameras
CN103985118A (en) * 2014-04-28 2014-08-13 无锡观智视觉科技有限公司 Parameter calibration method for cameras of vehicle-mounted all-round view system
EP2808645A1 (en) * 2012-01-23 2014-12-03 Nec Corporation Camera calibration device, camera calibration method, and camera calibration program
JP2015106785A (en) * 2013-11-29 2015-06-08 クラリオン株式会社 Camera calibration device
CN104833372A (en) * 2015-04-13 2015-08-12 武汉海达数云技术有限公司 External parameter calibration method of high-definition panoramic camera of mobile measuring system
US20150341629A1 (en) * 2014-05-21 2015-11-26 GM Global Technology Operations LLC Automatic calibration of extrinsic and intrinsic camera parameters for surround-view camera system
US20150371396A1 (en) * 2014-06-19 2015-12-24 Tata Consultancy Services Limited Constructing a 3d structure
CN105844624A (en) * 2016-03-18 2016-08-10 上海欧菲智能车联科技有限公司 Dynamic calibration system, and combined optimization method and combined optimization device in dynamic calibration system
DE102014117888A1 (en) * 2014-12-04 2016-10-13 Connaught Electronics Ltd. Online calibration of a motor vehicle camera system
CN106803273A (en) * 2017-01-17 2017-06-06 湖南优象科技有限公司 A kind of panoramic camera scaling method
WO2017161608A1 (en) * 2016-03-21 2017-09-28 完美幻境(北京)科技有限公司 Geometric calibration processing method and device for camera
WO2017195801A1 (en) * 2016-05-13 2017-11-16 オリンパス株式会社 Calibration device, calibration method, optical device, imaging device, projection device, measurement system and measurement method
EP3246875A2 (en) * 2016-05-18 2017-11-22 Siemens Healthcare GmbH Method and system for image registration using an intelligent artificial agent
FR3053554A1 (en) * 2016-06-30 2018-01-05 Continental Automotive France METHOD FOR RAPID CALIBRATION OF A MOTOR VEHICLE CAMERA
US20180131861A1 (en) * 2015-04-20 2018-05-10 Washington University Camera calibration with lenticular arrays
CN108090456A (en) * 2017-12-27 2018-05-29 北京初速度科技有限公司 A kind of Lane detection method and device
CN108171748A (en) * 2018-01-23 2018-06-15 哈工大机器人(合肥)国际创新研究院 A kind of visual identity of object manipulator intelligent grabbing application and localization method
CN108182704A (en) * 2016-12-08 2018-06-19 广州映博智能科技有限公司 Localization method based on Shape context feature
US20180189565A1 (en) * 2015-08-28 2018-07-05 Imperial College Of Science, Technology And Medicine Mapping a space using a multi-directional camera
CN108345875A (en) * 2018-04-08 2018-07-31 北京初速度科技有限公司 Wheeled region detection model training method, detection method and device

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100245576A1 (en) * 2009-03-31 2010-09-30 Aisin Seiki Kabushiki Kaisha Calibrating apparatus for on-board camera of vehicle
KR101271639B1 (en) * 2011-12-13 2013-06-17 (주)팜비젼 A extrinsic parameter calibration method and system for camera on mobile device
EP2808645A1 (en) * 2012-01-23 2014-12-03 Nec Corporation Camera calibration device, camera calibration method, and camera calibration program
CN103593836A (en) * 2012-08-14 2014-02-19 无锡维森智能传感技术有限公司 A Camera parameter calculating method and a method for determining vehicle body posture with cameras
JP2015106785A (en) * 2013-11-29 2015-06-08 クラリオン株式会社 Camera calibration device
CN103985118A (en) * 2014-04-28 2014-08-13 无锡观智视觉科技有限公司 Parameter calibration method for cameras of vehicle-mounted all-round view system
US20150341629A1 (en) * 2014-05-21 2015-11-26 GM Global Technology Operations LLC Automatic calibration of extrinsic and intrinsic camera parameters for surround-view camera system
US20150371396A1 (en) * 2014-06-19 2015-12-24 Tata Consultancy Services Limited Constructing a 3d structure
DE102014117888A1 (en) * 2014-12-04 2016-10-13 Connaught Electronics Ltd. Online calibration of a motor vehicle camera system
CN104833372A (en) * 2015-04-13 2015-08-12 武汉海达数云技术有限公司 External parameter calibration method of high-definition panoramic camera of mobile measuring system
US20180131861A1 (en) * 2015-04-20 2018-05-10 Washington University Camera calibration with lenticular arrays
US20180189565A1 (en) * 2015-08-28 2018-07-05 Imperial College Of Science, Technology And Medicine Mapping a space using a multi-directional camera
CN105844624A (en) * 2016-03-18 2016-08-10 上海欧菲智能车联科技有限公司 Dynamic calibration system, and combined optimization method and combined optimization device in dynamic calibration system
WO2017161608A1 (en) * 2016-03-21 2017-09-28 完美幻境(北京)科技有限公司 Geometric calibration processing method and device for camera
WO2017195801A1 (en) * 2016-05-13 2017-11-16 オリンパス株式会社 Calibration device, calibration method, optical device, imaging device, projection device, measurement system and measurement method
EP3246875A2 (en) * 2016-05-18 2017-11-22 Siemens Healthcare GmbH Method and system for image registration using an intelligent artificial agent
FR3053554A1 (en) * 2016-06-30 2018-01-05 Continental Automotive France METHOD FOR RAPID CALIBRATION OF A MOTOR VEHICLE CAMERA
CN108182704A (en) * 2016-12-08 2018-06-19 广州映博智能科技有限公司 Localization method based on shape context features
CN106803273A (en) * 2017-01-17 2017-06-06 湖南优象科技有限公司 A panoramic camera calibration method
CN108090456A (en) * 2017-12-27 2018-05-29 北京初速度科技有限公司 A lane line detection method and device
CN108171748A (en) * 2018-01-23 2018-06-15 哈工大机器人(合肥)国际创新研究院 A visual recognition and localization method for intelligent grasping by an industrial robot manipulator
CN108345875A (en) * 2018-04-08 2018-07-31 北京初速度科技有限公司 Drivable region detection model training method, detection method and device

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Quan Houde; Wang Jianhua; Zhao Bo: "Research on an extrinsic parameter calibration algorithm for large-field-of-view, long-range cameras", no. 11 *
Wang Zhuojun; Yang Qunfeng; Fan Kai; Sha Xinmei: "A discussion of a camera calibration method", vol. 44, no. 12 *
Wang Dehui; Fan Qingwen; Yuan Zhongfan: "A method for calibrating the intrinsic and extrinsic parameters of a digital camera based on multiple vanishing points", Journal of Sichuan University (Engineering Science Edition), no. 05, pages 193-196 *
Yan Long; Hua Zhen; Chen Chengjun; Du Huiqiu: "Research on simulation and verification techniques for photogrammetry systems", no. 06 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114708333A (en) * 2022-03-08 2022-07-05 智道网联科技(北京)有限公司 Method and device for generating a model for automatic calibration of camera extrinsic parameters
CN114708333B (en) * 2022-03-08 2024-05-31 智道网联科技(北京)有限公司 Method and device for generating a model for automatic calibration of camera extrinsic parameters

Also Published As

Publication number Publication date
CN110969664B (en) 2023-10-24

Similar Documents

Publication Publication Date Title
CN110969663B (en) Static calibration method for external parameters of camera
CN112894832B (en) Three-dimensional modeling method, three-dimensional modeling device, electronic equipment and storage medium
US11802769B2 (en) Lane line positioning method and apparatus, and storage medium thereof
AU2018282302B2 (en) Integrated sensor calibration in natural scenes
CN113870343B (en) Relative pose calibration method, device, computer equipment and storage medium
CN109374008A (en) A kind of image acquisition system and method based on a trinocular camera
CN111830953A (en) Vehicle self-positioning method, device and system
CN112577517A (en) Multi-element positioning sensor combined calibration method and system
CN104217439A (en) Indoor visual positioning system and method
CN109146958B (en) Traffic sign space position measuring method based on two-dimensional image
CN103411587B (en) Positioning and orientation method and system
CN113587934B (en) Robot, indoor positioning method and device and readable storage medium
CN108603933A (en) System and method for fusing sensor outputs with different resolutions
Zhou et al. Developing and testing robust autonomy: The University of Sydney campus data set
CN114782548B (en) Global image-based radar data calibration method, device, equipment and medium
CN115439531A (en) Method and device for acquiring spatial position information of a target object
CN113296133A (en) Device and method for realizing position calibration based on binocular vision measurement and high-precision positioning fusion technology
CN116883604A (en) Three-dimensional modeling technical method based on space, air and ground images
CN115690612A (en) Unmanned aerial vehicle photoelectric image target search quantization indicating method, device and medium
CN114777768A (en) High-precision positioning method and system for satellite rejection environment and electronic equipment
CN115439528A (en) Method and device for acquiring image position information of a target object
CN103260008A (en) Projection converting method from image position to actual position
Javed et al. PanoVILD: a challenging panoramic vision, inertial and LiDAR dataset for simultaneous localization and mapping
CN110969664B (en) Dynamic calibration method for external parameters of camera
CN109974660A (en) Method for measuring unmanned aerial vehicle hovering precision based on hovering video

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220301

Address after: 100083 unit 501, block AB, Dongsheng building, No. 8, Zhongguancun East Road, Haidian District, Beijing

Applicant after: BEIJING MOMENTA TECHNOLOGY Co.,Ltd.

Address before: Room 28, 4 / F, block a, Dongsheng building, No. 8, Zhongguancun East Road, Haidian District, Beijing 100089

Applicant before: BEIJING CHUSUDU TECHNOLOGY Co.,Ltd.

GR01 Patent grant