WO2017130823A1 - Camera calibration device and camera calibration method - Google Patents

Camera calibration device and camera calibration method Download PDF

Info

Publication number
WO2017130823A1
WO2017130823A1 PCT/JP2017/001637 JP2017001637W
Authority
WO
WIPO (PCT)
Prior art keywords
camera
area
feature points
calibration
extracted
Prior art date
Application number
PCT/JP2017/001637
Other languages
French (fr)
Japanese (ja)
Inventor
将由 道口
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Priority to CN201780007829.6A priority Critical patent/CN108496354B/en
Publication of WO2017130823A1 publication Critical patent/WO2017130823A1/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present disclosure relates to a camera calibration apparatus and a camera calibration method for calibrating a camera.
  • a technique is known in which a camera attached to a vehicle is used to image the surroundings of the vehicle and driving assistance is performed using the captured images.
  • a deviation from a design value occurs in a captured image due to an error in the mounting position of the camera and a manufacturing error in the camera itself. For this reason, it is necessary to calibrate the camera mounting position error in advance in a factory or the like.
  • the calibration marker is imaged by the camera, the feature points are extracted from the captured calibration marker image, and the camera parameters for calibrating the camera are adjusted based on the coordinates of the extracted feature points.
  • the camera calibration device includes a feature point extraction unit, an area calculation unit, and an adjustment unit.
  • the feature point extraction unit extracts a plurality of feature points from the calibration marker image captured by the camera.
  • the area calculation unit calculates the area of the figure defined by the feature points extracted by the feature point extraction unit.
  • the adjustment unit adjusts camera parameters for calibrating the camera based on the coordinates of the feature points extracted by the feature point extraction unit and the area of the graphic calculated by the area calculation unit.
  • Another aspect of the present disclosure is a camera calibration method. This method includes extracting a plurality of feature points from an image of a calibration marker captured by a camera, calculating the area of a figure defined by the extracted feature points, and adjusting camera parameters for calibrating the camera based on the coordinates of the extracted feature points and the calculated area of the figure.
  • the calibration accuracy of the camera can be improved.
  • FIG. 1A is a diagram illustrating a positional relationship between a vehicle and a calibration marker at the time of calibration according to an embodiment.
  • FIG. 1B is a plan view showing an example of the calibration marker of FIG. 1A.
  • FIG. 2 is a block diagram showing a schematic configuration of the camera calibration apparatus of FIG. 1A.
  • FIG. 3A is a diagram illustrating an image of a calibration marker in an ideal case where there is no error in the camera mounting position.
  • FIG. 3B is a diagram illustrating an image of the calibration marker in the case where there is an error in the yaw angle at the camera mounting position.
  • FIG. 4A is a diagram showing the feature points of FIG. 3A and FIG. 3B in an overlapping manner.
  • FIG. 4B is a diagram in which the feature points and figures of FIGS. 3A and 3B are overlapped.
  • FIG. 5 is a flowchart showing the processing of the camera calibration apparatus of FIG. 1A.
  • the present disclosure has been made in view of such circumstances, and the present disclosure provides a technique that can improve the calibration accuracy of a camera.
  • FIG. 1A is a diagram illustrating a positional relationship between the vehicle C1 and calibration markers M1 and M2 during calibration according to an embodiment
  • FIG. 1B is a plane illustrating an example of the calibration markers M1 and M2 in FIG. 1A.
  • FIG. 1A the surroundings of the vehicle C1 when the camera 10 is calibrated in a production factory of the vehicle C1 are viewed from above.
  • the vehicle C1 includes a camera 10 and a camera calibration device 20.
  • the camera 10 is attached to a back door or the like at the rear of the vehicle C1, and images the rear of the vehicle C1.
  • the camera 10 may be attached near the center axis of the vehicle, or may be attached offset from the center axis of the vehicle.
  • the camera calibration device 20 adjusts camera parameters for calibrating the camera 10.
  • the calibration markers M1 and M2 are arranged substantially perpendicular to the floor surface at a predetermined position behind the vehicle C1 within the imaging range of the camera 10.
  • the calibration markers M1 and M2 are arranged substantially symmetrically on both sides of the center axis of the vehicle.
  • each calibration marker M1, M2 has a checkered pattern in which 16 squares are arranged in a matrix.
  • the camera 10 images this checkerboard pattern.
  • FIG. 2 is a block diagram showing a schematic configuration of the camera calibration device 20 of FIG. 1A.
  • the camera calibration device 20 includes an image storage unit 22, a feature point extraction unit 24, an area calculation unit 26, and an adjustment unit 28.
  • the image storage unit 22 stores images of the calibration markers M1 and M2 captured by the camera 10.
  • the feature point extraction unit 24 extracts a plurality of feature points from the images of the calibration markers M1 and M2 stored in the image storage unit 22.
  • the method for extracting the feature points is not particularly limited, but the feature point extraction unit 24 may extract the feature points using a pattern matching technique.
  • the feature point extraction unit 24 scans the template for each feature point on the image, and extracts the feature point from the pattern on the image having a high degree of coincidence with the template.
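The template-scanning step above can be sketched as follows. This is a minimal, hypothetical illustration: it assumes a grayscale image stored as a list of lists and scores each position by sum of squared differences, neither of which is prescribed by the disclosure (a real implementation might use normalized cross-correlation or a dedicated corner detector).

```python
def find_feature_point(image, template):
    """Scan `template` over `image` and return the (row, col) of the
    template centre at the best-matching position. The score is the
    sum of squared differences (lower = better match)."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = float("inf"), None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = sum(
                (image[r + dr][c + dc] - template[dr][dc]) ** 2
                for dr in range(th) for dc in range(tw)
            )
            if score < best_score:
                best_score, best_pos = score, (r, c)
    # Report the template centre as the feature-point coordinate.
    return (best_pos[0] + th // 2, best_pos[1] + tw // 2)
```

For a checkerboard corner, the template would be a small two-by-two light/dark pattern, and the returned centre approximates the corner location.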
  • the area calculation unit 26 calculates the area of the graphic defined by the feature points extracted by the feature point extraction unit 24.
  • Each figure is a polygon having three or more feature points as vertices.
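The disclosure does not prescribe how the polygon area is computed; as one straightforward sketch, the standard shoelace formula over the vertex feature points works for the triangles F1 to F8 as well as any other simple polygon:

```python
def polygon_area(vertices):
    """Area of a simple polygon given as [(x, y), ...] vertices in
    order, via the shoelace formula."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0
```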
  • the adjustment unit 28 adjusts the camera parameters based on the feature point coordinates extracted by the feature point extraction unit 24 and the figure area calculated by the area calculation unit 26.
  • in terms of hardware, the configuration of the camera calibration device 20 can be realized by the CPU, memory, and other LSIs of an arbitrary computer; in terms of software, it is realized by a program loaded into the memory. The functional blocks depicted here are realized by their cooperation. Accordingly, those skilled in the art will understand that these functional blocks can be realized in various forms by hardware alone or by a combination of hardware and software.
  • FIG. 3A is a diagram illustrating an image I1 of the calibration markers M1 and M2 in an ideal case where there is no error in the mounting position of the camera 10, and FIG. 3B is a diagram illustrating an image I2 of the calibration markers M1 and M2 in a case where there is an error in the yaw angle of the mounting position of the camera 10.
  • the images I1 and I2 are captured by the camera 10.
  • an example of adjusting camera parameters in the case of FIG. 3B will be described.
  • FIG. 4A is a diagram in which the feature points P1C to P10C and P1 to P10 of FIGS. 3A and 3B are shown overlapped.
  • FIG. 4B is a diagram in which the feature points P1C to P10C and P1 to P10 and the figures F1C to F8C and F1 to F8 of FIGS. 3A and 3B are shown overlapped.
  • for clarity of explanation, the calibration markers M1 and M2 are omitted in FIGS. 4A and 4B.
  • the image I1 has ten feature points P1C to P10C, and the image I2 has ten feature points P1 to P10.
  • the feature points P1 and P1C are located at the center of the calibration marker M1.
  • the feature points P2 and P2C are located at the center of the four squares at the upper left of the calibration marker M1, and the feature points P3 and P3C are located at the center of the four squares at the upper right of the calibration marker M1.
  • the feature points P4 and P4C are located at the center of the four squares at the lower left of the calibration marker M1, and the feature points P5 and P5C are located at the center of the four squares at the lower right of the calibration marker M1. Since the relationship between the feature points P6 to P10 and P6C to P10C and the calibration marker M2 is the same, its description is omitted.
  • the figure F1 is a triangle having vertices at the feature points P1 to P3.
  • the figure F2 is a triangle having feature points P1, P4, and P5 as vertices.
  • the figure F3 is a triangle having the feature points P1, P2, and P4 as vertices.
  • the figure F4 is a triangle having feature points P1, P3, and P5 as vertices.
  • the figure F5 is a triangle having vertices at the feature points P6 to P8.
  • the figure F6 is a triangle having feature points P6, P9, and P10 as vertices.
  • the figure F7 is a triangle having vertices at the feature points P6, P7, and P9.
  • the figure F8 is a triangle having feature points P6, P8, and P10 as vertices.
  • in the image of the calibration marker M1, the area calculation unit 26 in FIG. 2 calculates the areas of the pair of figures F1 and F2 arranged in the first direction y (the up-down direction) and the areas of the pair of figures F3 and F4 arranged in the second direction x (the left-right direction). Similarly, in the image of the calibration marker M2, the area calculation unit 26 calculates the areas of the pair of figures F5 and F6 arranged in the first direction y and the areas of the pair of figures F7 and F8 arranged in the second direction x. The second direction x intersects the first direction y.
  • in the reference coordinate system (world coordinate system) based on the vehicle C1, the coordinates of the feature points of the calibration markers M1 and M2 are known.
  • using the camera parameters, the coordinates of these feature points on the image can be calculated.
  • it is assumed that the coordinates of the feature points on the image calculated using the initial camera parameters are equal to the coordinates of the ideal feature points P1C to P10C in FIG. 3A.
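The disclosure does not specify the camera model used to map the known world coordinates to image coordinates. As a hypothetical illustration only, a simplified pinhole projection with a single yaw rotation, a translation `t`, focal length `f`, and principal point `(cx, cy)` (lens distortion omitted) could be written as:

```python
import math

def project_point(world_xyz, yaw, t, f, cx, cy):
    """Project a 3-D point (vehicle reference frame) to pixel
    coordinates: rotate by the camera yaw angle, translate by t,
    then apply the pinhole intrinsics f and (cx, cy)."""
    x, y, z = world_xyz
    # Rotation about the vertical axis (yaw only, for illustration).
    xc = math.cos(yaw) * x + math.sin(yaw) * z + t[0]
    yc = y + t[1]
    zc = -math.sin(yaw) * x + math.cos(yaw) * z + t[2]
    # Perspective division onto the image plane.
    u = f * xc / zc + cx
    v = f * yc / zc + cy
    return (u, v)
```

Adjusting the camera parameters then amounts to tuning quantities like `yaw` so that such projections line up with the feature points extracted from the captured image.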
  • the adjustment unit 28 in FIG. 2 repeatedly adjusts the camera parameter so that the evaluation value becomes small, and brings the camera parameter close to the optimum value.
  • the evaluation value is the sum of the first evaluation value and the second evaluation value.
  • the first evaluation value is the sum of the distances L1 to L10 shown in FIG. 4A.
  • the distance L[i] (i is an integer from 1 to 10) is the distance between the calculated coordinates of the feature point P[i]C and the coordinates of the feature point P[i] extracted from the captured image. In FIG. 4A, the distances L1 and L6 are not shown.
  • the second evaluation value is the sum of the area differences S1 to S8.
  • the area difference S[j] (j is an integer from 1 to 8) is the difference between the area of the figure F[j]C calculated from the calculated feature points P1C to P10C and the area of the figure F[j] calculated from the feature points P1 to P10 extracted from the image.
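The evaluation value described above (the sum of the distances L1 to L10 plus the sum of the area differences S1 to S8) could be computed as in the following sketch; the function name and the equal weighting of the two terms are assumptions, since the disclosure does not state whether the terms are weighted.

```python
import math

def evaluation_value(ideal_points, extracted_points,
                     ideal_areas, extracted_areas):
    """First evaluation value: sum of point-to-point distances.
    Second evaluation value: sum of absolute area differences.
    The total is their (unweighted) sum."""
    first = sum(
        math.hypot(pc[0] - p[0], pc[1] - p[1])
        for pc, p in zip(ideal_points, extracted_points)
    )
    second = sum(
        abs(ac - a) for ac, a in zip(ideal_areas, extracted_areas)
    )
    return first + second
```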
  • for adjusting the camera parameters, a known method such as the steepest descent method can be used. The adjustment unit 28 in FIG. 2 determines that the evaluation value has converged, for example, when the evaluation value becomes substantially constant or when it becomes smaller than a predetermined threshold value, and then finishes adjusting the camera parameters.
  • when there is an error in the yaw angle of the mounting position of the camera 10, the area difference S3 between the area of the figure F3C and the area of the figure F3 in FIG. 4B is large. Therefore, compared with the case where only the feature points are used, changes during the iterative calculation can be grasped more clearly, and the calibration accuracy can be improved.
  • the area difference S1 between the area of the figure F1C and the area of the figure F1 is smaller than the area difference S3 between the area of the figure F3C and the area of the figure F3.
  • likewise, the area difference S2 between the area of the figure F2C and the area of the figure F2 is small. That is, the areas of the figures F1 to F8 change with different tendencies according to the error in the yaw angle of the mounting position of the camera 10. Therefore, by adjusting the camera parameters so that the sum of the area differences S1 to S8 becomes small, the error in the yaw angle of the mounting position of the camera 10 can be calibrated with higher accuracy.
  • FIG. 5 is a flowchart showing processing of the camera calibration device 20 of FIG. 1A.
  • the feature point extraction unit 24 extracts a plurality of feature points P1 to P10 from the images of the calibration markers M1 and M2 captured by the camera 10 (S1).
  • the area calculation unit 26 calculates the areas of the graphics F1 to F8 defined by the extracted feature points P1 to P10 (S2).
  • the adjusting unit 28 adjusts the camera parameters based on the coordinates of the extracted feature points P1 to P10 and the calculated areas of the graphics F1 to F8 (S3).
  • the adjustment unit 28 determines whether or not the evaluation value has converged (S4). If the evaluation value has not converged (N in S4), the process returns to S3. When the evaluation value has converged (Y in S4), the adjustment unit 28 ends the process.
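Steps S3 and S4 above can be sketched as an iterative loop. Here a one-dimensional numerical gradient descent stands in for the unspecified adjustment method, and the parameter, step size, and convergence threshold are all illustrative assumptions.

```python
def calibrate(eval_fn, theta0, lr=0.01, eps=1e-6, tol=1e-9, max_iter=1000):
    """Repeatedly adjust a scalar camera parameter `theta` to reduce
    the evaluation value `eval_fn(theta)` (step S3), stopping once the
    value has become substantially constant (step S4)."""
    theta = theta0
    prev = eval_fn(theta)
    for _ in range(max_iter):
        # Numerical gradient of the evaluation value.
        grad = (eval_fn(theta + eps) - eval_fn(theta - eps)) / (2 * eps)
        theta -= lr * grad
        cur = eval_fn(theta)
        if abs(prev - cur) < tol:  # S4: evaluation value converged
            break
        prev = cur
    return theta
```

In practice `eval_fn` would re-project the known marker feature points with the candidate parameters and return the combined distance-plus-area evaluation value.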
  • the adjusted camera parameters are stored in a storage unit (not shown) in the vehicle C1.
  • an image processing device (not shown) in the vehicle C1 corrects the image captured by the camera 10 using the stored camera parameters, and generates an image in which distortion due to an error in the mounting position of the camera 10 is corrected.
  • the corrected image is used for the driver to confirm the rear of the vehicle C1.
  • the number of feature amounts used for adjusting the camera parameters is increased compared with the case where only the coordinates of the feature points are used.
  • the area changes differently from the feature points according to the error in the mounting position of the camera 10.
  • the amount of change in the area due to the error in the mounting position of the camera 10 is larger than the amount of change in the coordinates of the feature points due to the error in the mounting position of the camera 10. Therefore, the accuracy of calibration can be improved as compared with the case where only feature points are used.
  • due to a shift in at least one of the pitch angle, yaw angle, and roll angle of the camera 10, the areas of the pair of figures F1 and F2 arranged in the first direction y and the areas of the pair of figures F3 and F4 arranged in the second direction x change with different tendencies.
  • similarly, the areas of the pair of figures F5 and F6 and the areas of the pair of figures F7 and F8 also change with different tendencies. Therefore, the influence of errors in the pitch angle, yaw angle, and roll angle of the mounting position of the camera 10 can be calibrated more accurately.
  • one calibration marker may be used.
  • although the calibration accuracy is lower than in the above-described embodiment, the installation space for the calibration marker can be reduced.
  • three or more calibration markers may be used.
  • although the calibration accuracy is improved compared with the above-described embodiment, the installation space for the calibration markers is increased.
  • the specific pattern of the calibration marker is not particularly limited as long as three or more feature points that define at least one figure can be extracted from one calibration marker.
  • in this case, the calibration accuracy is lower than in the above-described embodiment, but the camera parameter adjustment time may be shortened.
  • the figure may be a polygon other than a triangle, or may have a different shape.
  • the first evaluation value may be any value based on the distances L1 to L10 shown in FIG. 4A; for example, it may be the average of the distances L1 to L10.
  • the second evaluation value may be any value based on the area differences S1 to S8; for example, it may be the average of the area differences S1 to S8.
  • the position where the camera 10 is attached may be the front part or the side part of the vehicle C1. Further, the camera 10 attached to a device other than the vehicle C1 may be calibrated by the camera calibration device 20.
  • One aspect of the present disclosure is as follows.
  • a camera calibration device comprising: a feature point extraction unit that extracts a plurality of feature points from an image of a calibration marker captured by a camera; an area calculation unit that calculates the area of a figure defined by the feature points extracted by the feature point extraction unit; and an adjustment unit that adjusts camera parameters for calibrating the camera based on the coordinates of the feature points extracted by the feature point extraction unit and the area of the figure calculated by the area calculation unit.
  • the area changes differently from the feature points according to the camera mounting position error, and the amount of change in the area due to the camera mounting position error is larger than the amount of change in the coordinates of the feature points due to the camera mounting position error, so the calibration accuracy can be improved.
  • due to a shift in at least one of the pitch angle, yaw angle, and roll angle of the camera, the areas of the set of figures arranged in the first direction and the areas of the set of figures arranged in the second direction change with different tendencies, so the influence of camera mounting position errors can be calibrated with higher accuracy.
  • a camera calibration method comprising: extracting a plurality of feature points from an image of a calibration marker captured by a camera; calculating the area of a figure defined by the extracted feature points; and adjusting camera parameters for calibrating the camera based on the coordinates of the extracted feature points and the calculated area of the figure.
  • the area changes differently from the feature points according to the camera mounting position error, and the amount of change in the area due to the camera mounting position error is larger than the amount of change in the coordinates of the feature points due to the camera mounting position error, so the calibration accuracy can be improved.
  • the camera calibration apparatus and the camera calibration method according to the present disclosure can improve the calibration accuracy of the camera, and thus can be applied to the calibration of a camera mounted on a moving body such as an automobile.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Studio Devices (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Image Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A camera calibration device provided with a feature point extraction unit, an area calculation unit, and an adjustment unit. The feature point extraction unit extracts a plurality of feature points from an image of a marker for calibration that is captured by a camera. The area calculation unit calculates the area of a diagram defined by the extracted feature points. The adjustment unit adjusts, on the basis of the coordinates of the extracted feature points and the calculated area of the diagram, a camera parameter for calibrating the camera.

Description

Camera calibration apparatus and camera calibration method
The present disclosure relates to a camera calibration apparatus and a camera calibration method for calibrating a camera.
A technique is known in which a camera attached to a vehicle images the surroundings of the vehicle and driving assistance is performed using the captured images. In such a technique, a deviation from the design values occurs in the captured images due to errors in the mounting position of the camera and manufacturing errors in the camera itself. For this reason, it is necessary to calibrate the camera mounting position error in advance in a factory or the like. At the time of calibration, a calibration marker is imaged by the camera, feature points are extracted from the captured image of the calibration marker, and camera parameters for calibrating the camera are adjusted based on the coordinates of the extracted feature points (for example, see Patent Document 1). The larger the proportion of the captured image occupied by the calibration marker, and the more uniformly and densely the feature points are distributed, the more the calibration accuracy can be improved.
JP 2011-155687 A
A camera calibration device according to one aspect of the present disclosure includes a feature point extraction unit, an area calculation unit, and an adjustment unit. The feature point extraction unit extracts a plurality of feature points from an image of a calibration marker captured by a camera. The area calculation unit calculates the area of a figure defined by the feature points extracted by the feature point extraction unit. The adjustment unit adjusts camera parameters for calibrating the camera based on the coordinates of the feature points extracted by the feature point extraction unit and the area of the figure calculated by the area calculation unit.
Another aspect of the present disclosure is a camera calibration method. This method includes extracting a plurality of feature points from an image of a calibration marker captured by a camera, calculating the area of a figure defined by the extracted feature points, and adjusting camera parameters for calibrating the camera based on the coordinates of the extracted feature points and the calculated area of the figure.
According to the present disclosure, the calibration accuracy of the camera can be improved.
FIG. 1A is a diagram illustrating the positional relationship between a vehicle and calibration markers during calibration according to an embodiment. FIG. 1B is a plan view showing an example of the calibration markers of FIG. 1A. FIG. 2 is a block diagram showing a schematic configuration of the camera calibration apparatus of FIG. 1A. FIG. 3A is a diagram illustrating an image of the calibration markers in an ideal case where there is no error in the camera mounting position. FIG. 3B is a diagram illustrating an image of the calibration markers in a case where there is an error in the yaw angle of the camera mounting position. FIG. 4A is a diagram showing the feature points of FIGS. 3A and 3B overlapped. FIG. 4B is a diagram showing the feature points and figures of FIGS. 3A and 3B overlapped. FIG. 5 is a flowchart showing the processing of the camera calibration apparatus of FIG. 1A.
Prior to the description of the embodiments of the present disclosure, problems in a conventional camera calibration apparatus will be briefly described. In camera calibration, the use of a small calibration marker is desired because of installation-space constraints. The smaller the calibration marker, the smaller the number of feature points that can be extracted, and the feature points become concentrated in a part of the captured image. In this case, the movement of the feature points due to an error in the camera mounting position is less affected by lens distortion and approaches a parallel translation, so the calibration accuracy decreases.
The present disclosure has been made in view of such circumstances, and provides a technique that can improve the calibration accuracy of a camera.
FIG. 1A is a diagram illustrating the positional relationship between the vehicle C1 and the calibration markers M1 and M2 during calibration according to an embodiment, and FIG. 1B is a plan view illustrating an example of the calibration markers M1 and M2 in FIG. 1A. In FIG. 1A, the surroundings of the vehicle C1 are viewed from above when the camera 10 is calibrated in a production factory of the vehicle C1. The vehicle C1 includes a camera 10 and a camera calibration device 20. The camera 10 is attached to a back door or the like at the rear of the vehicle C1 and images the area behind the vehicle C1. The camera 10 may be attached near the center axis of the vehicle, or may be attached offset from the center axis of the vehicle. The camera calibration device 20 adjusts camera parameters for calibrating the camera 10.
The calibration markers M1 and M2 are arranged substantially perpendicular to the floor surface at predetermined positions behind the vehicle C1 within the imaging range of the camera 10. The calibration markers M1 and M2 are arranged substantially symmetrically on both sides of the center axis of the vehicle.
As shown in FIG. 1B, each of the calibration markers M1 and M2 has a checkered pattern in which 16 squares are arranged in a matrix. The camera 10 images this checkered pattern.
FIG. 2 is a block diagram showing a schematic configuration of the camera calibration device 20 of FIG. 1A. The camera calibration device 20 includes an image storage unit 22, a feature point extraction unit 24, an area calculation unit 26, and an adjustment unit 28.
The image storage unit 22 stores images of the calibration markers M1 and M2 captured by the camera 10.
The feature point extraction unit 24 extracts a plurality of feature points from the images of the calibration markers M1 and M2 stored in the image storage unit 22. The method for extracting the feature points is not particularly limited; for example, the feature point extraction unit 24 may extract the feature points using a pattern matching technique. When the pattern matching technique is used, the feature point extraction unit 24 scans a template for each feature point over the image and extracts the feature point from a pattern on the image having a high degree of coincidence with the template.
The area calculation unit 26 calculates the area of each figure defined by the feature points extracted by the feature point extraction unit 24. Each figure is a polygon having three or more feature points as vertices.
The adjustment unit 28 adjusts the camera parameters based on the coordinates of the feature points extracted by the feature point extraction unit 24 and the areas of the figures calculated by the area calculation unit 26.
Detailed functions of the area calculation unit 26 and the adjustment unit 28 will be described later.
In terms of hardware, the configuration of the camera calibration device 20 can be realized by the CPU, memory, and other LSIs of an arbitrary computer; in terms of software, it is realized by a program loaded into the memory. The functional blocks depicted here are realized by their cooperation. Accordingly, those skilled in the art will understand that these functional blocks can be realized in various forms by hardware alone or by a combination of hardware and software.
The adjustment of the camera parameters will be described with reference to FIGS. 3A, 3B, 4A, and 4B.
FIG. 3A shows an image I1 of the calibration markers M1 and M2 in the ideal case where there is no error in the mounting position of the camera 10, and FIG. 3B shows an image I2 of the calibration markers M1 and M2 when there is a yaw-angle error in the mounting position of the camera 10. The images I1 and I2 are captured by the camera 10. An example of adjusting the camera parameters in the case of FIG. 3B is described below.
FIG. 4A shows the feature points P1C to P10C of FIG. 3A superimposed on the feature points P1 to P10 of FIG. 3B, and FIG. 4B shows those feature points together with the figures F1C to F8C and F1 to F8. For clarity, the calibration markers M1 and M2 are omitted from FIGS. 4A and 4B.
As shown in FIGS. 3A and 3B, the image I1 contains ten feature points P1C to P10C, and the image I2 contains ten feature points P1 to P10. The feature points P1 and P1C are located at the center of the calibration marker M1. The feature points P2 and P2C are located at the center of the four squares at the upper left of the calibration marker M1, and the feature points P3 and P3C at the center of the four squares at the upper right. The feature points P4 and P4C are located at the center of the four squares at the lower left, and the feature points P5 and P5C at the center of the four squares at the lower right. The relationship between the feature points P6 to P10 and P6C to P10C and the calibration marker M2 is analogous, so its description is omitted.
The figure F1 is a triangle whose vertices are the feature points P1 to P3. The figure F2 is a triangle whose vertices are the feature points P1, P4, and P5. The figure F3 is a triangle whose vertices are the feature points P1, P2, and P4. The figure F4 is a triangle whose vertices are the feature points P1, P3, and P5.
The figure F5 is a triangle whose vertices are the feature points P6 to P8. The figure F6 is a triangle whose vertices are the feature points P6, P9, and P10. The figure F7 is a triangle whose vertices are the feature points P6, P7, and P9. The figure F8 is a triangle whose vertices are the feature points P6, P8, and P10.
The relationship between the figures F1C to F8C and the feature points P1C to P10C is analogous, so its description is omitted.
Thus, in the image of the calibration marker M1, the area calculation unit 26 of FIG. 2 calculates the areas of the pair of figures F1 and F2 arranged roughly in a first direction y (vertical) and the areas of the pair of figures F3 and F4 arranged roughly in a second direction x (horizontal). Likewise, in the image of the calibration marker M2, the area calculation unit 26 calculates the areas of the pair of figures F5 and F6 arranged roughly in the first direction y and the areas of the pair of figures F7 and F8 arranged roughly in the second direction x. The second direction x intersects the first direction y.
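Since each figure is a triangle defined by three feature points, its area follows directly from the three vertex coordinates. A minimal sketch of such a computation using the shoelace formula (the pixel coordinates in the example are hypothetical, not values from the embodiment):

```python
def triangle_area(p1, p2, p3):
    # Shoelace formula: half the absolute value of the cross product
    # of the two edge vectors from p1.
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0

# Example with hypothetical pixel coordinates for feature points P1, P2, P3.
area_f1 = triangle_area((100.0, 100.0), (80.0, 80.0), (120.0, 80.0))  # 400.0
```

The same function applies to any of the figures F1 to F8 and F1C to F8C, since each is defined by three feature points.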
Since the calibration markers M1 and M2 are installed at predetermined positions as described above, the coordinates of each feature point on the calibration markers M1 and M2 in the reference coordinate system (world coordinate system) based on the vehicle C1 are known. By applying a known method to the known feature-point coordinates in the reference coordinate system and the camera parameters, the coordinates of the feature points on the image can be computed. Here, the coordinates of the feature points on the image computed with the initial camera parameters are assumed to be equal to the coordinates of the ideal feature points P1C to P10C in FIG. 3A.
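As an illustrative sketch of computing image coordinates from known reference-coordinate feature points and camera parameters, the following applies a yaw-only rotation followed by a pinhole projection. The parameter names, the single rotation angle (a full model would also include pitch and roll), and the numeric values are simplifying assumptions, not the specific known method of the embodiment:

```python
import math

def project_point(world_pt, cam_pos, yaw, f, cx, cy):
    # Translate the world point into the camera frame, apply a yaw-only
    # rotation, then apply a pinhole projection with focal length f and
    # principal point (cx, cy).
    wx, wy, wz = world_pt
    tx, ty, tz = cam_pos
    dx, dy, dz = wx - tx, wy - ty, wz - tz
    xc = math.cos(yaw) * dx + math.sin(yaw) * dz
    zc = -math.sin(yaw) * dx + math.cos(yaw) * dz
    yc = dy
    return f * xc / zc + cx, f * yc / zc + cy

# A marker point straight ahead projects onto the principal point when the
# yaw error is zero (all values here are hypothetical).
u, v = project_point((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), 0.0, 100.0, 320.0, 240.0)
```

Repeating this for every known feature point yields the computed image coordinates (the P1C to P10C above) for the current camera parameters.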
If the camera parameters are at their optimum values, the computed coordinates of the feature points P1C to P10C are equal to the coordinates of the corresponding feature points P1 to P10 extracted from the captured image. The farther the camera parameters are from the optimum values, the larger the difference between the coordinates of the feature points P1C to P10C and those of the corresponding feature points P1 to P10. The adjustment unit 28 of FIG. 2 therefore repeatedly adjusts the camera parameters so that an evaluation value decreases, bringing the camera parameters closer to the optimum values. The evaluation value is the sum of a first evaluation value and a second evaluation value.
The first evaluation value is the sum of the distances L1 to L10 shown in FIG. 4A. The distance L[i] (i is an integer from 1 to 10) is the distance between the computed coordinates of the feature point P[i]C and the coordinates of the feature point P[i] extracted from the captured image. The distances L1 and L6 are omitted from FIG. 4A.
The second evaluation value is the sum of the area differences S1 to S8. The area difference S[j] (j is an integer from 1 to 8) is the difference between the area of the figure F[j]C computed from the feature points P1C to P10C and the area of the figure F[j] computed from the feature points P1 to P10 extracted from the image.
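The combined evaluation value can be sketched as follows; the function and argument names, and the numeric values in the example, are illustrative:

```python
import math

def evaluation_value(computed_pts, extracted_pts, computed_areas, extracted_areas):
    # First evaluation value: sum of the distances L[i] between the computed
    # feature points P[i]C and the extracted feature points P[i].
    first = sum(math.dist(pc, p) for pc, p in zip(computed_pts, extracted_pts))
    # Second evaluation value: sum of the area differences S[j] between the
    # figures F[j]C and F[j].
    second = sum(abs(sc - s) for sc, s in zip(computed_areas, extracted_areas))
    return first + second

# Hypothetical example: one point pair at distance 5 plus one area pair
# differing by 3 gives an evaluation value of 8.
val = evaluation_value([(0.0, 0.0)], [(3.0, 4.0)], [10.0], [7.0])
```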
A known method such as the steepest descent method can be used to adjust the camera parameters so that the evaluation value decreases. The adjustment unit 28 of FIG. 2 determines that the evaluation value has converged and ends the adjustment of the camera parameters, for example, when the evaluation value becomes substantially constant or falls below a predetermined threshold.
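A minimal numerical steepest-descent loop of this kind might look as follows; the learning rate, finite-difference step, iteration limit, and convergence threshold are illustrative assumptions rather than values from the embodiment:

```python
def calibrate(params, evaluate, lr=0.1, eps=1e-6, tol=1e-8, max_iter=1000):
    # Steepest descent with a forward-difference gradient. The loop ends
    # when the evaluation value falls below a threshold or becomes
    # substantially constant, mirroring the convergence test described above.
    prev = evaluate(params)
    for _ in range(max_iter):
        grad = []
        for k in range(len(params)):
            bumped = list(params)
            bumped[k] += eps
            grad.append((evaluate(bumped) - prev) / eps)
        params = [p - lr * g for p, g in zip(params, grad)]
        cur = evaluate(params)
        if cur < tol or abs(prev - cur) < tol:
            break
        prev = cur
    return params

# Toy run on a one-parameter quadratic: the minimum at 2.0 is recovered.
result = calibrate([0.0], lambda p: (p[0] - 2.0) ** 2)
```

In practice `evaluate` would be the evaluation value computed from the current camera parameters; here a one-parameter quadratic stands in for it.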
For example, the area difference S3 between the areas of the figures F3C and F3 in FIG. 4B is numerically larger than the distance L2 between the computed feature point P2C and the extracted feature point P2 in FIG. 4A. Compared with using only the feature points, changes across the iterations can therefore be captured more strongly, and the calibration accuracy can be improved.
Also, the area difference S1 between the areas of the figures F1C and F1 is smaller than the area difference S3 between the areas of the figures F3C and F3, and the area difference S2 between the areas of the figures F2C and F2 is smaller than the area difference S4 between the areas of the figures F4C and F4. That is, a yaw-angle error in the mounting position of the camera 10 changes the areas of the figures F1 to F8 with different tendencies. Adjusting the camera parameters so that the sum of the area differences S1 to S8 decreases therefore calibrates the yaw-angle error of the mounting position of the camera 10 more accurately.
Similarly, when there is an error in the pitch angle or roll angle of the mounting position of the camera 10, the areas of the figures F1 to F8 also change with different tendencies, so the calibration can be performed more accurately.
FIG. 5 is a flowchart showing the processing of the camera calibration device 20 of FIG. 1A. First, the feature point extraction unit 24 extracts the feature points P1 to P10 from the images of the calibration markers M1 and M2 captured by the camera 10 (S1). Next, the area calculation unit 26 calculates the areas of the figures F1 to F8 defined by the extracted feature points P1 to P10 (S2). Next, the adjustment unit 28 adjusts the camera parameters based on the coordinates of the extracted feature points P1 to P10 and the calculated areas of the figures F1 to F8 (S3). The adjustment unit 28 then determines whether the evaluation value has converged (S4); if it has not (N in S4), the process returns to S3, and if it has (Y in S4), the process ends.
The adjusted camera parameters are stored in a storage unit (not shown) in the vehicle C1. An image processing device (not shown) in the vehicle C1 uses the stored camera parameters to correct the images captured by the camera 10, generating images in which distortion due to errors in the mounting position of the camera 10 has been corrected. The corrected images are used, for example, for the driver to check the area behind the vehicle C1.
Thus, according to the present embodiment, the number of feature quantities used to adjust the camera parameters increases compared with the case where only the coordinates of the feature points are used. Moreover, the areas change differently from the feature points according to the error in the mounting position of the camera 10, and the change in area caused by that error is larger than the change in the coordinates of the feature points. The calibration accuracy can therefore be improved compared with the case where only the feature points are used.
Consequently, a smaller calibration marker can be used without degrading the calibration accuracy compared with the case where only the feature points are used.
Furthermore, a deviation in at least one of the pitch angle, yaw angle, and roll angle of the camera 10 changes the areas of the pair of figures F1 and F2 arranged in the first direction y and the areas of the pair of figures F3 and F4 arranged in the second direction x with different tendencies. The same holds for the pair of figures F5 and F6 and the pair of figures F7 and F8. The influence of errors in the pitch angle, yaw angle, and roll angle of the mounting position of the camera 10 can therefore be calibrated more accurately.
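The different tendencies under a yaw error can be illustrated numerically: projecting five coplanar marker points with a small yaw rotation leaves the areas of the vertically arranged pair of triangles equal by symmetry, while the areas of the horizontally arranged pair diverge. The marker geometry, focal length, and yaw value below are hypothetical:

```python
import math

def project(pt, yaw, f):
    # Yaw-only rotation about the camera's vertical axis, then pinhole projection.
    x, y, z = pt
    xc = math.cos(yaw) * x + math.sin(yaw) * z
    zc = -math.sin(yaw) * x + math.cos(yaw) * z
    return f * xc / zc, f * y / zc

def tri_area(a, b, c):
    # Shoelace formula for a triangle in image coordinates.
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

# Hypothetical marker: a center point and four corner points, 5 units ahead.
world = {"P1": (0.0, 0.0, 5.0), "P2": (-0.5, 0.5, 5.0), "P3": (0.5, 0.5, 5.0),
         "P4": (-0.5, -0.5, 5.0), "P5": (0.5, -0.5, 5.0)}
img = {k: project(p, yaw=0.1, f=1000.0) for k, p in world.items()}

a_f1 = tri_area(img["P1"], img["P2"], img["P3"])  # upper triangle (first direction)
a_f2 = tri_area(img["P1"], img["P4"], img["P5"])  # lower triangle (first direction)
a_f3 = tri_area(img["P1"], img["P2"], img["P4"])  # left triangle (second direction)
a_f4 = tri_area(img["P1"], img["P3"], img["P5"])  # right triangle (second direction)
# Under a pure yaw error the vertical pair stays balanced while the
# horizontal pair diverges, so the two pairs change with different tendencies.
```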
The present disclosure has been described above based on an embodiment. This embodiment is an example, and those skilled in the art will understand that various modifications of its constituent elements and combinations of processing are possible and that such modifications are also within the scope of the present disclosure.
For example, a single calibration marker may be used. In that case, the calibration accuracy is lower than in the embodiment described above, but the installation space for the calibration marker can be reduced.
Three or more calibration markers may also be used. In that case, the calibration accuracy is higher than in the embodiment described above, but the installation space for the calibration markers increases.
The specific pattern of the calibration marker is not particularly limited as long as three or more feature points defining at least one figure can be extracted from a single calibration marker. When the camera parameters are adjusted based on the area of a single figure, the calibration accuracy is lower than in the embodiment described above, but the adjustment time for the camera parameters may be shorter.
The figures may be polygons other than triangles, and may each have a different shape.
Furthermore, the first evaluation value may be any value based on the distances L1 to L10 shown in FIG. 4A, such as their average. Likewise, the second evaluation value may be any value based on the area differences S1 to S8, such as their average.
In addition, the camera 10 may be mounted at other positions, such as the front or a side of the vehicle C1, and the camera calibration device 20 may calibrate a camera 10 mounted on a device other than the vehicle C1.
One aspect of the present disclosure is as follows.
[Item 1]
A camera calibration device comprising:
a feature point extraction unit that extracts a plurality of feature points from an image of a calibration marker captured by a camera;
an area calculation unit that calculates the area of a figure defined by the feature points extracted by the feature point extraction unit; and
an adjustment unit that adjusts camera parameters for calibrating the camera based on the coordinates of the feature points extracted by the feature point extraction unit and the area of the figure calculated by the area calculation unit.
According to this aspect, the area changes differently from the feature points according to the error in the mounting position of the camera, and the change in area caused by that error is larger than the change in the coordinates of the feature points, so the calibration accuracy can be improved.
[Item 2]
The camera calibration device according to Item 1, wherein the area calculation unit calculates the areas of a pair of the figures arranged in a first direction and the areas of a pair of the figures arranged in a second direction intersecting the first direction.
In this case, a deviation in at least one of the pitch angle, yaw angle, and roll angle of the camera changes the areas of the pair of figures arranged in the first direction and the areas of the pair of figures arranged in the second direction with different tendencies, so the influence of errors in the mounting position of the camera can be calibrated more accurately.
[Item 3]
A camera calibration method comprising:
extracting a plurality of feature points from an image of a calibration marker captured by a camera;
calculating the area of a figure defined by the extracted feature points; and
adjusting camera parameters for calibrating the camera based on the coordinates of the extracted feature points and the calculated area of the figure.
According to this aspect, the area changes differently from the feature points according to the error in the mounting position of the camera, and the change in area caused by that error is larger than the change in the coordinates of the feature points, so the calibration accuracy can be improved.
The camera calibration device and camera calibration method according to the present disclosure improve the calibration accuracy of a camera and can therefore be applied to the calibration of cameras mounted on moving bodies such as automobiles.
DESCRIPTION OF SYMBOLS
10 Camera
20 Camera calibration device
24 Feature point extraction unit
26 Area calculation unit
28 Adjustment unit

Claims (3)

  1.  A camera calibration device comprising:
      a feature point extraction unit that extracts a plurality of feature points from an image of a calibration marker captured by a camera;
      an area calculation unit that calculates the area of a figure defined by the feature points extracted by the feature point extraction unit; and
      an adjustment unit that adjusts camera parameters for calibrating the camera based on the coordinates of the feature points extracted by the feature point extraction unit and the area of the figure calculated by the area calculation unit.
  2.  The camera calibration device according to claim 1, wherein the area calculation unit calculates the areas of a pair of the figures arranged in a first direction and the areas of a pair of the figures arranged in a second direction intersecting the first direction.
  3.  A camera calibration method comprising:
      extracting a plurality of feature points from an image of a calibration marker captured by a camera;
      calculating the area of a figure defined by the extracted feature points; and
      adjusting camera parameters for calibrating the camera based on the coordinates of the extracted feature points and the calculated area of the figure.
PCT/JP2017/001637 2016-01-29 2017-01-19 Camera calibration device and camera calibration method WO2017130823A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201780007829.6A CN108496354B (en) 2016-01-29 2017-01-19 Camera calibration device and camera calibration method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016016382A JP6688989B2 (en) 2016-01-29 2016-01-29 Camera calibration device and camera calibration method
JP2016-016382 2016-01-29

Publications (1)

Publication Number Publication Date
WO2017130823A1

Family

ID=59397764

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/001637 WO2017130823A1 (en) 2016-01-29 2017-01-19 Camera calibration device and camera calibration method

Country Status (3)

Country Link
JP (1) JP6688989B2 (en)
CN (1) CN108496354B (en)
WO (1) WO2017130823A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3742192B1 (en) * 2018-09-28 2023-09-06 NEXION S.p.A. System for calibrating a vehicle camera

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
US10504244B2 (en) * 2017-09-28 2019-12-10 Baidu Usa Llc Systems and methods to improve camera intrinsic parameter calibration
CN109272474B (en) * 2018-11-21 2022-04-12 大陆汽车车身电子系统(芜湖)有限公司 Method for determining correction parameters of imaging system and pre-correction method of imaging system
CN110827357B (en) * 2019-09-30 2024-03-29 深圳市安思疆科技有限公司 Combined pattern calibration plate and structured light camera parameter calibration method

Citations (1)

Publication number Priority date Publication date Assignee Title
JPH08110206A (en) * 1994-10-12 1996-04-30 Ricoh Co Ltd Method and apparatus for detecting position and posture

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
JP2003078811A (en) * 2001-09-04 2003-03-14 Nippon Hoso Kyokai <Nhk> Method for associating marker coordinate, method and system for acquiring camera parameter and calibration pattern
CN100557635C (en) * 2008-06-10 2009-11-04 北京航空航天大学 A kind of camera marking method based on flexible stereo target
JP4751939B2 (en) * 2009-03-31 2011-08-17 アイシン精機株式会社 Car camera calibration system
JP5091902B2 (en) * 2009-03-31 2012-12-05 アイシン精機株式会社 Calibration index used for calibration of in-vehicle camera, in-vehicle camera calibration method and system using the calibration index, and program for the system
CN103854271B (en) * 2012-11-28 2016-08-31 华中科技大学 A kind of planar pickup machine scaling method
CN104008548B (en) * 2014-06-04 2017-04-19 无锡维森智能传感技术有限公司 Feature point extraction method for vehicle-mounted around view system camera parameter calibration
CN104217429A (en) * 2014-08-25 2014-12-17 太仓中科信息技术研究院 Design and detection method of camera calibration board



Also Published As

Publication number Publication date
CN108496354A (en) 2018-09-04
CN108496354B (en) 2020-12-08
JP2017135680A (en) 2017-08-03
JP6688989B2 (en) 2020-04-28


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17744053; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17744053; Country of ref document: EP; Kind code of ref document: A1)