WO2017130823A1 - Camera calibration device and camera calibration method - Google Patents

Camera calibration device and camera calibration method (カメラ校正装置およびカメラ校正方法)

Info

Publication number
WO2017130823A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
area
feature points
calibration
extracted
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/001637
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
将由 道口
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Panasonic Intellectual Property Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co Ltd filed Critical Panasonic Intellectual Property Management Co Ltd
Priority to CN201780007829.6A priority Critical patent/CN108496354B/zh
Publication of WO2017130823A1 publication Critical patent/WO2017130823A1/ja
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present disclosure relates to a camera calibration apparatus and a camera calibration method for calibrating a camera.
  • a technique is known in which a camera attached to a vehicle is used to image the surroundings of the vehicle and driving assistance is performed using the captured images.
  • a deviation from a design value occurs in a captured image due to an error in the mounting position of the camera and a manufacturing error in the camera itself. For this reason, it is necessary to calibrate the camera mounting position error in advance in a factory or the like.
  • the calibration marker is imaged by the camera, the feature points are extracted from the captured calibration marker image, and the camera parameters for calibrating the camera are adjusted based on the coordinates of the extracted feature points.
  • the camera calibration device includes a feature point extraction unit, an area calculation unit, and an adjustment unit.
  • the feature point extraction unit extracts a plurality of feature points from the calibration marker image captured by the camera.
  • the area calculation unit calculates the area of the figure defined by the feature points extracted by the feature point extraction unit.
  • the adjustment unit adjusts camera parameters for calibrating the camera based on the coordinates of the feature points extracted by the feature point extraction unit and the area of the graphic calculated by the area calculation unit.
  • Another aspect of the present disclosure is a camera calibration method. This method extracts a plurality of feature points from an image of a calibration marker captured by a camera, calculates the area of a figure defined by the extracted feature points, and adjusts camera parameters for calibrating the camera based on the coordinates of the extracted feature points and the calculated area of the figure.
  • the calibration accuracy of the camera can be improved.
  • FIG. 1A is a diagram illustrating a positional relationship between a vehicle and a calibration marker at the time of calibration according to an embodiment.
  • FIG. 1B is a plan view showing an example of the calibration marker of FIG. 1A.
  • FIG. 2 is a block diagram showing a schematic configuration of the camera calibration apparatus of FIG. 1A.
  • FIG. 3A is a diagram illustrating an image of a calibration marker in an ideal case where there is no error in the camera mounting position.
  • FIG. 3B is a diagram illustrating an image of the calibration marker in the case where there is an error in the yaw angle at the camera mounting position.
  • FIG. 4A is a diagram showing the feature points of FIG. 3A and FIG. 3B in an overlapping manner.
  • FIG. 4B is a diagram in which the feature points and figures of FIGS. 3A and 3B are overlapped.
  • FIG. 5 is a flowchart showing the processing of the camera calibration apparatus of FIG. 1A.
  • The present disclosure has been made in view of such circumstances and provides a technique that can improve the calibration accuracy of a camera.
  • FIG. 1A is a diagram illustrating the positional relationship between the vehicle C1 and the calibration markers M1 and M2 during calibration according to an embodiment, and FIG. 1B is a plan view illustrating an example of the calibration markers M1 and M2 in FIG. 1A.
  • In FIG. 1A, the surroundings of the vehicle C1 are viewed from above when the camera 10 is calibrated in a production factory for the vehicle C1.
  • the vehicle C1 includes a camera 10 and a camera calibration device 20.
  • the camera 10 is attached to a back door or the like at the rear of the vehicle C1, and images the rear of the vehicle C1.
  • the camera 10 may be attached near the center axis of the vehicle, or may be attached offset from the center axis of the vehicle.
  • the camera calibration device 20 adjusts camera parameters for calibrating the camera 10.
  • the calibration markers M1 and M2 are arranged substantially perpendicular to the floor surface at a predetermined position behind the vehicle C1 within the imaging range of the camera 10.
  • the calibration markers M1 and M2 are arranged substantially symmetrically on both sides of the center axis of the vehicle.
  • each calibration marker M1, M2 has a checkered pattern in which 16 squares are arranged in a matrix.
  • the camera 10 images this checkerboard pattern.
  • FIG. 2 is a block diagram showing a schematic configuration of the camera calibration device 20 of FIG. 1A.
  • the camera calibration device 20 includes an image storage unit 22, a feature point extraction unit 24, an area calculation unit 26, and an adjustment unit 28.
  • the image storage unit 22 stores images of the calibration markers M1 and M2 captured by the camera 10.
  • the feature point extraction unit 24 extracts a plurality of feature points from the images of the calibration markers M1 and M2 stored in the image storage unit 22.
  • the method for extracting the feature points is not particularly limited, but the feature point extraction unit 24 may extract the feature points using a pattern matching technique.
  • For example, the feature point extraction unit 24 scans a template for each feature point over the image and extracts the feature point at the position where the image pattern most closely matches the template.
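The template scan described above can be sketched in a few lines. This is an illustrative assumption: the patent does not name a matching metric, so the sum of absolute differences (SAD) stands in here, and `match_template`, the toy image, and the template are all hypothetical.

```python
# Hypothetical sketch: locate a feature-point template in a grayscale image
# by exhaustively scanning and scoring each position with the sum of
# absolute differences (SAD); the lowest SAD is the best match.

def match_template(image, template):
    """Return the (row, col) of the top-left corner where `template`
    best matches `image` (lowest SAD). Both are 2-D lists of ints."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_pos, best_sad = None, float("inf")
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            sad = sum(
                abs(image[r + dr][c + dc] - template[dr][dc])
                for dr in range(th)
                for dc in range(tw)
            )
            if sad < best_sad:
                best_sad, best_pos = sad, (r, c)
    return best_pos

# Toy example: a 2x2 checker-corner pattern embedded in a 4x4 image.
img = [
    [0, 0, 0, 0],
    [0, 0, 255, 0],
    [0, 255, 0, 0],
    [0, 0, 0, 0],
]
tmpl = [[0, 255], [255, 0]]
print(match_template(img, tmpl))  # -> (1, 1)
```

A production implementation would use a normalized score and sub-pixel refinement, but the scan-and-score structure is the same.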
  • the area calculation unit 26 calculates the area of the graphic defined by the feature points extracted by the feature point extraction unit 24.
  • Each figure is a polygon having three or more feature points as vertices.
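The area of such a polygon can be computed directly from its vertex coordinates with the shoelace formula. A minimal sketch (the function name and the sample coordinates are hypothetical, not from the patent):

```python
def polygon_area(vertices):
    """Area of a simple polygon given its vertices [(x, y), ...] in order
    around the boundary (shoelace formula)."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# A triangle like figure F1: one apex and two base feature points.
print(polygon_area([(2.0, 2.0), (1.0, 1.0), (3.0, 1.0)]))  # -> 1.0
```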
  • the adjustment unit 28 adjusts the camera parameters based on the feature point coordinates extracted by the feature point extraction unit 24 and the figure area calculated by the area calculation unit 26.
  • The configuration of the camera calibration device 20 can be realized in hardware by the CPU, memory, and other LSIs of an arbitrary computer, and in software by a program loaded into memory; the functional blocks described here are realized by their cooperation. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware alone or by a combination of hardware and software.
  • FIG. 3A is a diagram illustrating the image I1 of the calibration markers M1 and M2 in the ideal case where there is no error in the mounting position of the camera 10, and FIG. 3B is a diagram illustrating the image I2 of the calibration markers M1 and M2 in the case where there is an error in the yaw angle of the mounting position of the camera 10. The images I1 and I2 are captured by the camera 10.
  • an example of adjusting camera parameters in the case of FIG. 3B will be described.
  • FIG. 4A is a diagram showing the feature points P1C to P10C of FIG. 3A and P1 to P10 of FIG. 3B in an overlapping manner.
  • FIG. 4B is a diagram showing the figures F1C to F8C and F1 to F8 of FIGS. 3A and 3B in an overlapping manner.
  • the calibration markers M1 and M2 are removed in FIGS. 4A and 4B.
  • the image I1 has ten feature points P1C to P10C, and the image I2 has ten feature points P1 to P10.
  • the feature points P1 and P1C are located at the center of the calibration marker M1.
  • the feature points P2 and P2C are located at the center of the four squares at the upper left of the calibration marker M1, and the feature points P3 and P3C are located at the center of the four squares at the upper right of the calibration marker M1.
  • The feature points P4 and P4C are located at the center of the lower-left four squares of the calibration marker M1, and the feature points P5 and P5C are located at the center of the lower-right four squares of the calibration marker M1. Since the relationship between the feature points P6 to P10 and P6C to P10C and the calibration marker M2 is the same, the description thereof is omitted.
  • the figure F1 is a triangle having vertices at the feature points P1 to P3.
  • the figure F2 is a triangle having feature points P1, P4, and P5 as vertices.
  • the figure F3 is a triangle having the feature points P1, P2, and P4 as vertices.
  • the figure F4 is a triangle having feature points P1, P3, and P5 as vertices.
  • the figure F5 is a triangle having vertices at the feature points P6 to P8.
  • the figure F6 is a triangle having feature points P6, P9, and P10 as vertices.
  • the figure F7 is a triangle having vertices at the feature points P6, P7, and P9.
  • the figure F8 is a triangle having feature points P6, P8, and P10 as vertices.
  • In the image of the calibration marker M1, the area calculation unit 26 of FIG. 2 calculates the area of the pair of figures F1 and F2 arranged in the first direction y (the vertical direction) and the area of the pair of figures F3 and F4 arranged in the second direction x (the horizontal direction). Similarly, in the image of the calibration marker M2, the area calculation unit 26 calculates the area of the pair of figures F5 and F6 arranged in the first direction y and the area of the pair of figures F7 and F8 arranged in the second direction x. The second direction x intersects the first direction y.
  • In the reference coordinate system (world coordinate system) based on the vehicle C1, the coordinates of the feature points of each of the calibration markers M1 and M2 are known. Using the camera parameters, the coordinates of these feature points on the image can therefore be calculated.
  • it is assumed that the coordinates of the feature points on the image calculated using the initial camera parameters are equal to the coordinates of the ideal feature points P1C to P10C in FIG. 3A.
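Computing image coordinates from known world coordinates can be sketched with a simplified pinhole model. This is an illustrative assumption: the patent does not specify the camera model, and `project_point`, its parameter names, and the numbers below are hypothetical.

```python
import math

def project_point(pw, yaw, tx, ty, tz, f, cx, cy):
    """Project the world point pw = (X, Y, Z) to a pixel (u, v) with a
    simplified pinhole model: a rotation about the vertical axis by
    `yaw` radians, a translation (tx, ty, tz) into the camera frame,
    focal length f, and principal point (cx, cy). A real calibration
    would also model pitch, roll, and lens distortion."""
    X, Y, Z = pw
    # Rotate about the vertical axis, then translate into the camera frame.
    xc = math.cos(yaw) * X + math.sin(yaw) * Z + tx
    yc = Y + ty
    zc = -math.sin(yaw) * X + math.cos(yaw) * Z + tz
    # Perspective division onto the image plane.
    u = f * xc / zc + cx
    v = f * yc / zc + cy
    return u, v

# With zero yaw, a point on the optical axis lands at the principal point.
print(project_point((0.0, 0.0, 2.0), 0.0, 0.0, 0.0, 0.0, 800.0, 640.0, 360.0))
# -> (640.0, 360.0)
```

Evaluating this projection with the initial camera parameters yields the ideal feature-point coordinates; re-evaluating it with adjusted parameters is what the iterative loop below compares against the extracted points.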
  • the adjustment unit 28 in FIG. 2 repeatedly adjusts the camera parameter so that the evaluation value becomes small, and brings the camera parameter close to the optimum value.
  • the evaluation value is the sum of the first evaluation value and the second evaluation value.
  • the first evaluation value is the sum of the distances L1 to L10 shown in FIG. 4A.
  • The distance L[i] (i is an integer from 1 to 10) is the distance between the calculated coordinates of the feature point P[i]C and the coordinates of the feature point P[i] extracted from the captured image. In FIG. 4A, the distances L1 and L6 are not shown.
  • the second evaluation value is the sum of the area differences S1 to S8.
  • The area difference S[j] (j is an integer from 1 to 8) is the difference between the area of the figure F[j]C calculated from the calculated feature points P1C to P10C and the area of the figure F[j] calculated from the feature points P1 to P10 extracted from the image.
  • For the adjustment, a known method such as the steepest descent method can be used. The adjustment unit 28 of FIG. 2 finishes adjusting the camera parameters by determining that the evaluation value has converged, for example, when the evaluation value becomes substantially constant or falls below a predetermined threshold value.
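The adjustment loop can be sketched end to end on a toy problem. This is a minimal sketch under stated assumptions: a single `yaw` parameter stands in for the full camera-parameter set, a homography induced by a pure yaw rotation stands in for the camera model, and the point list `IDEAL` and triangle index list `TRIS` are hypothetical. The evaluation value is, as in the patent, the sum of the point distances L[i] plus the sum of the area differences S[j], minimized by steepest descent with a numerical gradient and a convergence check.

```python
import math

def tri_area(p, q, r):
    """Area of the triangle p, q, r (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = p, q, r
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0

def model(points, yaw):
    """Image positions predicted for a camera yawed by `yaw` radians
    (homography induced by a pure yaw rotation, unit focal length)."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [((c * x + s) / (-s * x + c), y / (-s * x + c)) for x, y in points]

IDEAL = [(0.4, 0.4), (0.2, 0.2), (0.6, 0.2), (-0.4, 0.4), (-0.6, 0.2)]
TRIS = [(0, 1, 2), (0, 3, 4)]        # index triples defining the figures
TRUE_YAW = 0.10
EXTRACTED = model(IDEAL, TRUE_YAW)   # plays the role of the extracted P1..P10

def evaluation(yaw):
    pred = model(IDEAL, yaw)
    e1 = sum(math.dist(p, q) for p, q in zip(pred, EXTRACTED))  # sum of L[i]
    e2 = sum(abs(tri_area(*(pred[k] for k in t))                # sum of S[j]
                 - tri_area(*(EXTRACTED[k] for k in t))) for t in TRIS)
    return e1 + e2

yaw, step = 0.0, 1e-2
for _ in range(2000):
    e = evaluation(yaw)
    if e < 1e-9:                      # convergence threshold
        break
    g = (evaluation(yaw + 1e-7) - evaluation(yaw - 1e-7)) / 2e-7
    while step > 1e-12 and evaluation(yaw - step * g) >= e:
        step *= 0.5                   # backtrack until the value decreases
    yaw -= step * g
print(yaw)  # should land near TRUE_YAW
```

The backtracking line search is one concrete way to make the fixed-step descent converge; any standard optimizer would do in its place.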
  • As shown in FIG. 4B, the area difference S3 between the area of the figure F3C and the area of the figure F3 is large. Therefore, compared with the case where only the feature points are used, changes during the iterative calculation can be captured more strongly, and the calibration accuracy can be improved.
  • In contrast, the area difference S1 between the area of the figure F1C and the area of the figure F1 is smaller than the area difference S3, and the area difference S2 between the area of the figure F2C and the area of the figure F2 is also small. That is, the areas of the figures F1 to F8 change with different tendencies due to the error in the yaw angle of the mounting position of the camera 10. Therefore, by adjusting the camera parameters so that the sum of the area differences S1 to S8 becomes small, the error in the yaw angle of the mounting position of the camera 10 can be calibrated with higher accuracy.
  • FIG. 5 is a flowchart showing processing of the camera calibration device 20 of FIG. 1A.
  • the feature point extraction unit 24 extracts a plurality of feature points P1 to P10 from the images of the calibration markers M1 and M2 captured by the camera 10 (S1).
  • the area calculation unit 26 calculates the areas of the graphics F1 to F8 defined by the extracted feature points P1 to P10 (S2).
  • the adjusting unit 28 adjusts the camera parameters based on the coordinates of the extracted feature points P1 to P10 and the calculated areas of the graphics F1 to F8 (S3).
  • the adjustment unit 28 determines whether or not the evaluation value has converged (S4). If the evaluation value has not converged (N in S4), the process returns to S3. When the evaluation value has converged (Y in S4), the adjustment unit 28 ends the process.
  • the adjusted camera parameters are stored in a storage unit (not shown) in the vehicle C1.
  • An image processing device (not shown) in the vehicle C1 corrects the image captured by the camera 10 using the stored camera parameters, and generates an image in which the distortion due to the error in the mounting position of the camera 10 is corrected.
  • the corrected image is used for the driver to confirm the rear of the vehicle C1.
  • By using the areas in addition to the coordinates of the feature points, the amount of information used for adjusting the camera parameters is increased compared with the case where only the coordinates of the feature points are used.
  • the area changes differently from the feature points according to the error in the mounting position of the camera 10.
  • the amount of change in the area due to the error in the mounting position of the camera 10 is larger than the amount of change in the coordinates of the feature points due to the error in the mounting position of the camera 10. Therefore, the accuracy of calibration can be improved as compared with the case where only feature points are used.
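A quick numerical check of this behavior, under stated assumptions: the warp below is the homography induced by a pure yaw rotation (unit focal length), and the triangle coordinates are made up. It only shows that a yaw error changes the figure area as well as the point coordinates; how the two magnitudes compare depends on the geometry.

```python
import math

def warp(points, yaw):
    """Homography induced by a pure yaw rotation (unit focal length)."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [((c * x + s) / (-s * x + c), y / (-s * x + c)) for x, y in points]

def tri_area(p, q, r):
    """Area of the triangle p, q, r (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = p, q, r
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0

ideal = [(0.8, 0.4), (0.6, 0.2), (1.0, 0.2)]  # hypothetical P1C, P2C, P3C
shifted = warp(ideal, 0.05)                    # the same points with a yaw error

coord_shift = max(math.dist(p, q) for p, q in zip(ideal, shifted))
area_diff = abs(tri_area(*ideal) - tri_area(*shifted))
print(coord_shift > 0 and area_diff > 0)  # -> True
```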
  • When at least one of the pitch angle, the yaw angle, and the roll angle of the camera 10 deviates, the area of the pair of figures F1 and F2 arranged in the first direction y and the area of the pair of figures F3 and F4 arranged in the second direction x change with different tendencies. Likewise, the area of the pair of figures F5 and F6 and the area of the pair of figures F7 and F8 change with different tendencies. Therefore, the influence of errors in the pitch angle, the yaw angle, and the roll angle of the mounting position of the camera 10 can be calibrated more accurately.
  • One calibration marker may be used. Although the calibration accuracy is then lower than in the above-described embodiment, the installation space for the calibration markers can be reduced.
  • Three or more calibration markers may be used. Although the calibration accuracy is then improved compared with the above-described embodiment, the installation space for the calibration markers is increased.
  • the specific pattern of the calibration marker is not particularly limited as long as three or more feature points that define at least one figure can be extracted from one calibration marker.
  • In such a case, the calibration accuracy is lower than in the above-described embodiment, but the camera parameter adjustment time may be shorter.
  • the figure may be a polygon other than a triangle, or may have a different shape.
  • The first evaluation value may be any value based on the distances L1 to L10 shown in FIG. 4A; for example, it may be the average of the distances L1 to L10.
  • Similarly, the second evaluation value may be any value based on the area differences S1 to S8; for example, it may be the average of the area differences S1 to S8.
  • the position where the camera 10 is attached may be the front part or the side part of the vehicle C1. Further, the camera 10 attached to a device other than the vehicle C1 may be calibrated by the camera calibration device 20.
  • One aspect of the present disclosure is as follows.
  • A camera calibration device comprising: a feature point extraction unit that extracts a plurality of feature points from an image of a calibration marker captured by a camera; an area calculation unit that calculates the area of a figure defined by the feature points extracted by the feature point extraction unit; and an adjustment unit that adjusts camera parameters for calibrating the camera based on the coordinates of the feature points extracted by the feature point extraction unit and the area of the figure calculated by the area calculation unit.
  • Since the area changes differently from the feature points according to the camera mounting position error, and the amount of change in the area due to the camera mounting position error is larger than the amount of change in the coordinates of the feature points, the accuracy of calibration can be improved.
  • Since the area of the set of figures arranged in the first direction and the area of the set of figures arranged in the second direction change with different tendencies due to a shift in at least one of the pitch angle, the yaw angle, and the roll angle of the camera, the effects of camera mounting position errors can be calibrated with higher accuracy.
  • A camera calibration method comprising: extracting a plurality of feature points from an image of a calibration marker captured by a camera; calculating the area of a figure defined by the extracted feature points; and adjusting camera parameters for calibrating the camera based on the coordinates of the extracted feature points and the calculated area of the figure.
  • Since the area changes differently from the feature points according to the camera mounting position error, and the amount of change in the area due to the camera mounting position error is larger than the amount of change in the coordinates of the feature points, the accuracy of calibration can be improved.
  • the camera calibration apparatus and the camera calibration method according to the present disclosure can improve the calibration accuracy of the camera, and thus can be applied to the calibration of a camera mounted on a moving body such as an automobile.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Studio Devices (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Image Analysis (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Image Processing (AREA)
PCT/JP2017/001637 2016-01-29 2017-01-19 カメラ校正装置およびカメラ校正方法 Ceased WO2017130823A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201780007829.6A CN108496354B (zh) 2016-01-29 2017-01-19 照相机校正装置和照相机校正方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-016382 2016-01-29
JP2016016382A JP6688989B2 (ja) 2016-01-29 2016-01-29 カメラ校正装置およびカメラ校正方法

Publications (1)

Publication Number Publication Date
WO2017130823A1 true WO2017130823A1 (ja) 2017-08-03

Family

ID=59397764

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/001637 Ceased WO2017130823A1 (ja) 2016-01-29 2017-01-19 カメラ校正装置およびカメラ校正方法

Country Status (3)

Country Link
JP (1) JP6688989B2
CN (1) CN108496354B
WO (1) WO2017130823A1

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3742192B1 (en) * 2018-09-28 2023-09-06 NEXION S.p.A. System for calibrating a vehicle camera

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10504244B2 (en) * 2017-09-28 2019-12-10 Baidu Usa Llc Systems and methods to improve camera intrinsic parameter calibration
CN109272474B (zh) * 2018-11-21 2022-04-12 大陆汽车车身电子系统(芜湖)有限公司 确定成像系统校正参数方法及成像系统的预校正方法
CN110827357B (zh) * 2019-09-30 2024-03-29 深圳市安思疆科技有限公司 一种组合图案标定板以及结构光相机参数标定方法

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08110206A (ja) * 1994-10-12 1996-04-30 Ricoh Co Ltd 位置及び姿勢検出方法と位置及び姿勢検出装置

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003078811A (ja) * 2001-09-04 2003-03-14 Nippon Hoso Kyokai <Nhk> マーカ座標対応付け方法とカメラパラメータ取得方法とカメラパラメータ取得システムとキャリブレーションパターン
CN100557635C (zh) * 2008-06-10 2009-11-04 北京航空航天大学 一种基于柔性立体靶标的摄像机标定方法
JP5091902B2 (ja) * 2009-03-31 2012-12-05 アイシン精機株式会社 車載カメラの校正に用いられる校正指標と、当該校正指標を用いた車載カメラの校正方法と、システムと、当該システムのためのプログラム
JP4751939B2 (ja) * 2009-03-31 2011-08-17 アイシン精機株式会社 車載カメラの校正装置
CN103854271B (zh) * 2012-11-28 2016-08-31 华中科技大学 一种平面摄像机标定方法
CN104008548B (zh) * 2014-06-04 2017-04-19 无锡维森智能传感技术有限公司 一种用于车载环视系统摄像头参数标定的特征点抽取方法
CN104217429A (zh) * 2014-08-25 2014-12-17 太仓中科信息技术研究院 一种摄像机标定板的设计与检测方法

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08110206A (ja) * 1994-10-12 1996-04-30 Ricoh Co Ltd 位置及び姿勢検出方法と位置及び姿勢検出装置

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3742192B1 (en) * 2018-09-28 2023-09-06 NEXION S.p.A. System for calibrating a vehicle camera

Also Published As

Publication number Publication date
JP2017135680A (ja) 2017-08-03
CN108496354A (zh) 2018-09-04
CN108496354B (zh) 2020-12-08
JP6688989B2 (ja) 2020-04-28

Similar Documents

Publication Publication Date Title
CN108292439B (zh) 校准安装至车辆的摄像机的取向的方法和存储介质
CN111337906B (zh) 传感器校准方法和传感器校准设备
US10192309B2 (en) Camera calibration device
EP3260882B1 (en) Radar apparatus mounted on moving body and azimuth angle correction method for use in radar apparatus mounted on moving body
CN105445721B (zh) 基于带特征突起v型标定物的激光雷达与摄像机联合标定方法
JP6348093B2 (ja) 入力データから検出対象物の像を検出する画像処理装置および方法
EP3692396B1 (en) Distance measurement using a longitudinal grid pattern
US20190141313A1 (en) Calibration method, calibration device, and computer program product
WO2017130823A1 (ja) カメラ校正装置およびカメラ校正方法
JP6536529B2 (ja) 車載カメラのキャリブレーション装置及び車載カメラのキャリブレーション方法
JP2009204532A (ja) 距離画像センサの校正装置及び校正方法
CN110361717A (zh) 激光雷达-摄像机联合标定靶和联合标定方法
CN106846412B (zh) 一种棋盘格角点检测方法及装置
JP2016200557A (ja) 校正装置、距離計測装置及び校正方法
Maroli et al. Automated rotational calibration of multiple 3D LIDAR units for intelligent vehicles
CN114119771A (zh) 一种毫米波雷达和相机联合标定方法
JP2016111585A (ja) 画像処理装置、システム、画像処理方法、およびプログラム
JP7069943B2 (ja) 方位検出システム、方位検出方法、及び方位検出プログラム
JP2018125706A (ja) 撮像装置
JP7318522B2 (ja) 推定装置、推定方法、推定プログラム
US10126646B2 (en) Method of calculating a shift value of a cell contact
JP7318521B2 (ja) 推定装置、推定方法、推定プログラム
Ishii et al. A practical calibration method for top view image generation
CN117805785A (zh) 一种斜对角安装的双激光雷达标定方法和装置
KR20160104896A (ko) 탱크 제작장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17744053

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17744053

Country of ref document: EP

Kind code of ref document: A1