WO2018130605A1 - Method for calibrating a camera for a motor vehicle considering a calibration error, camera as well as motor vehicle - Google Patents

Method for calibrating a camera for a motor vehicle considering a calibration error, camera as well as motor vehicle

Info

Publication number
WO2018130605A1
WO2018130605A1 (PCT application PCT/EP2018/050637)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
calibration
point
determined
calibration object
Prior art date
Application number
PCT/EP2018/050637
Other languages
French (fr)
Inventor
Miguel Fernandez
Original Assignee
Connaught Electronics Ltd.
Priority date
Filing date
Publication date
Application filed by Connaught Electronics Ltd. filed Critical Connaught Electronics Ltd.
Publication of WO2018130605A1 publication Critical patent/WO2018130605A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle


Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a method for calibrating a camera (4) for a motor vehicle (1), in which a calibration object (9) is positioned in an environment (8) of the camera (4), an image of the calibration object (9) is captured by the camera (4), the camera (4) is calibrated depending on the captured image and a calibration error is determined for evaluating the calibration, wherein a point (P2) of a reference point (P1) of the calibration object (9) imaged to an image sensor (10) of the camera (4) is determined for determining the calibration error, wherein a back projected point (P4), which describes a back projection of the imaged point (P2) into the environment (8) of the camera (4), is determined and the calibration error is determined based on a deviation between the reference point (P1) of the calibration object (9) and the back projected point (P4).

Description

Method for calibrating a camera for a motor vehicle considering a calibration error, camera as well as motor vehicle
The present invention relates to a method for calibrating a camera for a motor vehicle, in which a calibration object is positioned in an environment of the camera, an image of the calibration object is captured by the camera, the camera is calibrated depending on the captured image and a calibration error is determined for evaluation of the calibration, wherein for determining the calibration error, a point of a reference point of the calibration object imaged to an image sensor of the camera is determined. Further, the present invention relates to a camera for a motor vehicle. Finally, the present invention relates to a motor vehicle with at least one such camera.
Presently, the interest is directed to cameras for motor vehicles. Such cameras can for example be a part of a driver assistance system or camera system, which serves to assist a driver of the motor vehicle in driving the motor vehicle. For example, images of an environmental region of the motor vehicle can be captured by the camera, which are then presented to the driver on a display device. It can also be provided that objects in the environment of the motor vehicle are detected based on the images with the aid of corresponding object detection algorithms. To allow reliable operation of the driver assistance system or of the camera system, it is required that the cameras are correspondingly calibrated.
For calibrating cameras, different methods are known from the prior art. For example, a calibration object can be positioned in the environment of the camera for calibrating the camera. Therein, an imaged point can be determined, which describes the imaging of a reference point of the calibration object to the image sensor of the camera. Based on the known relative location of the calibration object to the camera or the image sensor, a projected point can also be determined, which describes the projection of the reference point of the calibration object to the image sensor. Based on a deviation between the imaged point and the projected point, the calibration of the camera can then be assessed or performed. This is also referred to as the back projection error. In this known method, the calibration is performed within the image space. Further, methods are known from the prior art, in which a calibration object or the reference point of the calibration object is captured by two or more cameras to be able to evaluate the calibration. It is the object of the present invention to demonstrate a solution, how a calibration of a camera for a motor vehicle can be carried out in a more comparable and reliable manner.
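For contrast with the approach of the invention, the following minimal sketch (in Python, with illustrative function and variable names that are not taken from the patent) shows how such an image-space back projection error between imaged and projected points is typically computed:

```python
import numpy as np

def reprojection_error_px(imaged_points_px, projected_points_px):
    """Classical image-space back projection error in pixels (prior-art style).

    imaged_points_px: (N, 2) detected positions of the reference points in the image.
    projected_points_px: (N, 2) positions predicted by the current camera model.
    """
    diffs = np.asarray(imaged_points_px) - np.asarray(projected_points_px)
    return float(np.mean(np.linalg.norm(diffs, axis=1)))  # mean pixel deviation
```

Because this value is expressed in pixels, it depends on the image resolution and lens geometry, which is exactly the comparability problem addressed below.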
According to the invention, this object is solved by a method, by a camera as well as by a motor vehicle having the features according to the respective independent claims.
Advantageous developments of the present invention are the subject matter of the dependent claims.
According to an embodiment of a method for calibrating a camera for a motor vehicle, a calibration object is preferably positioned in an environment of the camera. Moreover, an image of the calibration object is preferably captured by the camera. Furthermore, the camera is in particular calibrated depending on the captured image. Further, a calibration error is in particular determined for evaluation of the calibration, wherein a point of a reference point of the calibration object imaged by an image sensor of the camera is preferably determined for determining the calibration error. In addition, it is preferably provided that a back projected point describing a back projection of the imaged point into the environment of the camera is determined. Furthermore, the calibration error is preferably determined based on a deviation between the reference point of the calibration object and the back projected point.
A method according to the invention serves for calibrating a camera for a motor vehicle. Herein, a calibration object is positioned in an environment of the camera. Further, an image of the calibration object is captured by the camera. In addition, the camera is calibrated depending on the captured image. For evaluating the calibration, a calibration error is determined, wherein a point of a reference point of the calibration object imaged to an image sensor of the camera is determined for determining the calibration error.
Moreover, a back projected point describing a back projection of the imaged point into the environment of the camera is determined and the calibration error is determined based on a deviation between the reference point of the calibration object and the back projected point.
With the aid of the method, a camera for a motor vehicle is to be calibrated. In particular, the calibration of the camera is to be evaluated. The camera is adapted for use at a motor vehicle. For example, the camera can be a part of a camera system or a driver assistance system, which serves for assisting a driver of the motor vehicle in driving the motor vehicle. With the aid of the camera, images of an environmental region of the motor vehicle can for example be provided. These images can be presented to the driver on a display device. It can also be provided that objects in the environment of the motor vehicle are detected based on the images provided by the camera. Before this camera is mounted on the motor vehicle, it is required to calibrate the camera. For calibration of the camera, the calibration object, the spatial dimensions of which can be known, is positioned relative to the camera. The calibration object is positioned relative to the camera in such a way that the calibration object can be captured by the camera. The calibration object can be arranged in a predetermined position. The calibration object has at least one reference point. At least one image is captured of this calibration object with the aid of the camera. Based on this image, the calibration or the current setting of the camera can then be examined. Therein, intrinsic parameters and/or extrinsic parameters of the camera can be adapted to perform the calibration. After the calibration of the camera has been performed, it is required to evaluate it. To this end, a calibration error is determined. For determining the calibration error, the imaged point on the image sensor is determined, which is the projection of the reference point of the calibration object.
According to an essential aspect of the present invention, it is now provided that the back projected point is determined. This back projected point describes the back projection of the imaged point on the image sensor into the environment of the camera. Thus, it is determined where in the environment of the camera the imaged point is actually located. The back projected point describes the back projection of the imaged point from the image sensor into the real world. In particular, the back projected point does not describe the projection of a three-dimensional point into a two-dimensional image plane. For determining the calibration error, the deviation between the reference point of the calibration object in the environment of the camera and the back projected point in the environment of the camera is then determined. If the imaged point is a two-dimensional point, the back projection of this imaged point into the real three-dimensional world is a line. In order to determine the back projected point on this line, extrinsic parameters can be used. These extrinsic parameters can, in particular, describe the position of the calibration object and/or the reference point. In the present case, the calibration error is not determined within the image plane or on the image sensor, but in the real world. This is based on the realization that the determination of the calibration error in the image plane makes it impossible to set a suitable calibration error threshold which can be used for multiple different cameras. A calibration error determined within the image plane cannot be used as a comparative measure for different camera types or for different environmental situations. Since the calibration error is presently determined in the real world, the relative location of the calibration object to the camera can also be taken into account. Further, the calibration error can be determined using a single camera. Thus, the calibration can be evaluated overall in a simple and reliable manner. In addition, the calibration of different cameras can be compared with each other.
Preferably, a respective back projected point is determined for a plurality of reference points of the calibration object, and the calibration error is determined on the basis of the respective deviation between the reference point and the back projected point. In order to increase the accuracy in the determination of the calibration error, a plurality of reference points of the calibration object can be used. For each of the reference points, the back projected point can then be determined from the imaged point. An individual calibration error can then be determined on the basis of the respective deviation between the reference point and the corresponding back projected point. The total calibration error can be determined from the individual calibration errors. For this purpose, for example, the least squares method can be used. The number of reference points depends on the model of the camera to be calibrated. In particular, the number of reference points depends on the intrinsic parameters of the camera. In addition, the number of reference points depends on the geometry of the calibration object. If, in the simplest case, a pinhole camera and a planar calibration object are used, a homography with eight degrees of freedom is obtained. For this purpose, four non-collinear reference points are necessary. If the camera is a camera with a fisheye lens, more reference points are required.
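As an illustration only, the following Python sketch shows one way the individual deviations for several reference points could be combined into a total calibration error, here using a root-mean-square (least-squares style) aggregation; the function and variable names are assumptions and not taken from the patent:

```python
import numpy as np

def total_calibration_error(reference_points, back_projected_points):
    """Combine per-point deviations into one total calibration error.

    reference_points, back_projected_points: (N, 3) arrays of 3D points in the
    environment of the camera (illustrative names).
    """
    diffs = np.asarray(reference_points) - np.asarray(back_projected_points)
    per_point_error = np.linalg.norm(diffs, axis=1)        # Euclidean deviation per reference point
    return float(np.sqrt(np.mean(per_point_error ** 2)))   # RMS / least-squares style aggregation
```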
Preferably, a distance between the reference point and a projection center is determined and the back projected point is determined such that a distance between the back projected point and the projection center and the distance between the reference point and the projection center are identical. Considering a certain projection model, for example the central projection, a projection center can be determined. Since the position of the reference point of the calibration object is known, the distance between the reference point and the projection center can also be determined. Presently, it is assumed that the reference point of the calibration object and the back projected point have the same distance to the projection center. Therein, the back projected point describes a projection of the imaged point with respect to the projection center. Thus, the back projected point can be determined in a simple and reliable manner.
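The following sketch illustrates how the back projected point could be determined under an ideal central (pinhole) projection: the imaged point defines a viewing ray through the projection center, and the back projected point is taken on this ray at the distance d of the reference point. The intrinsic matrix K and the camera-coordinate convention are assumptions made for the example, not requirements of the patent:

```python
import numpy as np

def back_project_point(imaged_point_px, K, reference_point, projection_center=None):
    """Back project a 2D imaged point into the environment (pinhole assumption).

    imaged_point_px: (u, v) pixel coordinates of the imaged point P2.
    K: 3x3 intrinsic matrix of the (already calibrated) camera.
    reference_point: known 3D position of the reference point P1 in camera coordinates.
    The back projected point P4 lies on the viewing ray through P2, at the same
    distance d from the projection center C as the reference point P1.
    """
    if projection_center is None:
        projection_center = np.zeros(3)                       # C at the camera origin (assumption)
    u, v = imaged_point_px
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])            # direction of the viewing ray
    ray /= np.linalg.norm(ray)
    d = np.linalg.norm(np.asarray(reference_point, dtype=float) - projection_center)
    return projection_center + d * ray                        # back projected point P4
```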
According to a further embodiment, the calibration error is normalized based on the distance between the reference point and the projection center. As soon as the distance between the reference point of the calibration object and the projection center is known, it can be used to normalize or weight the calibration error. Thus, a calibration error can be used for evaluating the calibration, which is independent of the distance of the reference point of the calibration object to the projection center. Thereby, the calibration error also becomes independent of the distance of the calibration object to the camera. This allows using the calibration error also for different arrangements of cameras relative to calibration objects.
In addition, it is advantageous if a projected point is determined, which describes a projection of the reference point of the calibration object to the image sensor by means of a central projection. As already explained, the central projection can be used to describe the imaging of the calibration object to the image sensor of the camera. Thus, based on the position of the reference point of the calibration object, the projected point on the image sensor can be determined. The determination of the projected point can be carried out with the aid of an image processing method, for example a method of corner detection. This projected point describes the imaging of the reference point of the calibration object to the image sensor considering the projection center. Thus, a model can be provided in simple and reliable manner, which describes the imaging of the calibration object by means of the camera.
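A minimal sketch of such a central projection of the reference point to the image sensor is given below, assuming a simple pinhole model with an intrinsic matrix K; this is an assumption for illustration, the patent does not prescribe a specific projection model:

```python
import numpy as np

def project_point(reference_point, K):
    """Central projection of the 3D reference point P1 to the image sensor (point P3).

    reference_point: 3D point in camera coordinates (projection center at the origin).
    K: 3x3 intrinsic matrix; a pinhole model is assumed here for simplicity.
    """
    X = np.asarray(reference_point, dtype=float)
    uvw = K @ X                    # homogeneous image coordinates
    return uvw[:2] / uvw[2]        # projected point P3 in pixels
```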
According to a further configuration, the deviation is determined based on an angle between a first line connecting the projection center and the reference point of the calibration object and a second line connecting the projection center to the back projected point. Thus, the first line describes the projection of the reference point of the calibration object to the projected point on the image sensor. The second line describes the projection of the imaged point on the image sensor to the back projected point in the environment of the camera. The first line and the second line extend through the projection center. The angle between the first line and the second line now describes the deviation between the back projected point and the reference point of the calibration object. Thus, the deviation between these points and therefore also the calibration error can be determined based on the angle in a simple manner.
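The angle between the first and the second line can be obtained from the direction vectors of the two lines, for example as in this illustrative sketch (names are assumptions):

```python
import numpy as np

def angle_between_lines(projection_center, reference_point, back_projected_point):
    """Angle theta between line G1 (C -> P1) and line G2 (C -> P4), in radians."""
    v1 = np.asarray(reference_point, dtype=float) - np.asarray(projection_center, dtype=float)
    v2 = np.asarray(back_projected_point, dtype=float) - np.asarray(projection_center, dtype=float)
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))   # clip guards against rounding errors
```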
According to an embodiment, the deviation is determined based on a distance between the reference point of the calibration object and the back projected point. For assessing the calibration, the Euclidean distance between the reference point of the calibration object and the back projected point can in particular be determined. In particular, this can be performed in a simple manner if the distance between the back projected point and the projection center and the distance between the projection center and the reference point of the calibration object, respectively, as well as the angle between the first and the second line are known.
According to a further embodiment, the deviation is determined based on an arc length of an arc, which extends through the reference point of the calibration object and the back projected point and the center point of which is the projection center. Therein, the arc can in particular be a circular arc. This arc length can be determined in a simple manner if the angle between the first line and the second line is known.
Furthermore, it is advantageous if the calibration error is compared to a threshold value, wherein the threshold value is determined depending on a distance between the reference point of the calibration object and the camera. Thus, it can for example be preset which deviations between the reference point of the calibration object and the back projected point can be tolerated depending on the distance of the calibration object to the camera. Therein, the distance between the reference point of the calibration object and the back projected point in the real world can be preset. Such a distance is therefore easy to visualize.
Preferably, intrinsic parameters and/or extrinsic parameters of the camera are adapted in calibrating the camera. The extrinsic camera parameters describe the position and/or orientation of the camera in space. The intrinsic parameters for example describe the focal length of the camera. Further, the coordinates of the image center point represent intrinsic parameters. The pixel scaling in different spatial directions also represents an intrinsic parameter. These parameters can be adapted during the calibration, and subsequently the performed calibration can be examined based on the calibration error. This is in particular suitable in large-scale production of the camera to be able to reliably calibrate it.
A camera according to the invention for a motor vehicle is calibrated according to a method according to the invention. The camera can have a motor vehicle fixing device for fixing the camera to the motor vehicle. For example, the camera can constitute a part of a camera system or driver assistance system of the motor vehicle. Therein, it is in particular provided that the camera has a fish-eye objective or a fish-eye lens. In particular, in such cameras having a fish-eye objective, the use of the calibration error defined within the real world is suitable.
A motor vehicle according to the invention includes at least one camera according to the invention. The motor vehicle is in particular formed as a passenger car. The preferred embodiments presented with respect to the method according to the invention and the advantages thereof correspondingly apply to the camera according to the invention as well as to the motor vehicle according to the invention.
Further features of the invention are apparent from the claims, the figures and the description of figures. The features and feature combinations mentioned above in the description as well as the features and feature combinations mentioned below in the description of figures and/or shown in the figures alone are usable not only in the respectively specified combination, but also in other combinations or alone without departing from the scope of the invention. Thus, implementations are also to be considered as encompassed and disclosed by the invention, which are not explicitly shown in the figures and explained, but arise from and can be generated by separate feature combinations from the explained implementations. Implementations and feature combinations are also to be considered as disclosed, which thus do not have all of the features of an originally formulated independent claim. Moreover, implementations and feature combinations are to be considered as disclosed, in particular by the implementations set out above, which extend beyond or deviate from the feature combinations set out in the back-references of the claims.
Now, the invention is explained in more detail based on preferred embodiments as well as with reference to the attached drawings.
There show:
Fig. 1 a motor vehicle according to an embodiment of the present invention, which has a plurality of cameras; and
Fig. 2 a schematic representation of a projection of a calibration object to an image sensor of the camera.
In the figures, identical and functionally identical elements are provided with the same reference characters.
Fig. 1 shows a motor vehicle 1 according to an embodiment of the present invention in a plan view. Presently, the motor vehicle 1 is formed as a passenger car. The motor vehicle 1 includes a camera system 2, which includes an electronic controller 3. Moreover, the camera system 2 includes a plurality of cameras 4. In the present embodiment, the camera system 2 includes four cameras 4. Therein, one camera 4 is arranged in a front area 7 of the motor vehicle 1, one camera 4 is arranged in a rear area 5 of the motor vehicle 1 and two cameras 4 are arranged in respective lateral areas 6 of the motor vehicle 1. The number and arrangement of the cameras 4 at the motor vehicle 1 are to be understood purely as an example.
Images can be provided by the cameras 4, which describe the environment 8 of the motor vehicle 1 or of the cameras 4. The cameras 4 are connected to the electronic controller 3 for data transfer. Corresponding data lines are presently not illustrated for the sake of clarity. The images provided by the cameras 4 can be evaluated with the aid of the controller 3. The cameras 4 in particular have fish-eye objectives.
In order to ensure reliable operation of the cameras 4, it is required to correspondingly calibrate the cameras 4. For calibrating the camera 4, a calibration object 9 is positioned in the environment 8 of the camera 4. Fig. 2 shows a schematic representation of such a calibration object 9 and a schematic representation of an image sensor 10 of the camera 4. An image of the calibration object 9 is captured with the aid of the camera 4. Therein, a reference point P1 of the calibration object 9 is imaged to the image sensor 10 such that an imaged point P2 results. Moreover, a projected point P3 is determined on the image sensor 10, which describes the projection of the reference point P1 to the image sensor 10. For the projection of the reference point P1 to the image sensor 10, presently, a central projection is used, which describes a projection center C. Based on this central projection, a back projected point P4 can also be determined, which describes a back projection of the imaged point P2 into the environment 8 of the camera 4.
Therein, a distance d between the projection center C and the reference point P1 of the calibration object 9 can be determined. In addition, the back projected point P4 is determined such that it has the same distance d to the projection center C as the reference point P1 of the calibration object 9. Further, a first line G1 is shown, which connects the projection center C to the reference point P1. A second line G2 connects the projection center C to the back projected point P4. An angle Θ describes the angle between the first line G1 and the second line G2. Lines G1 and G2 are straight lines. The connection between the projection center C and the imaged point P2 as well as the connection between the projection center C and the projected point P3 are also represented as straight lines in the present case. These connections may also be different from a straight line. This depends on the camera model used. Based on the deviation between the reference point P1 and the back projected point P4, a calibration error can be determined, which describes the calibration of the camera 4.
Therein, a distance L2 or a Euclidean distance between the three-dimensional data points P1 and P4 can be determined. The distance L2 can be determined according to the following formula based on the distance d between the projection center C and the reference point P1 as well as the angle Θ between the first line G1 and the second line G2:
L2 = d · √(2 · (1 − cos Θ)).
It can also be provided that an arc length a is determined, which describes the distance between the reference point P1 and the back projected point P4. Therein, the arc length a can be determined along a circular arc centered on the projection center C according to the following formula: a = Θ · d.
The distance L2 and the arc length a are values which can be intuitively and simply understood. They describe an error, which can for example be indicated in millimeters and which can be used for evaluating the error in the three-dimensional space. This allows calibrating the camera 4 for different settings and conditions. In addition, the calibration of the camera 4 can be assessed in a simple manner based on the calibration error. The greater the calibration error, i.e. the distance L2 or the arc length a, the worse the calibration. In addition, two cameras 4 different from each other, which for example have different capturing ranges or image sensors, can also be compared to each other.
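The two error measures can be evaluated directly from the distance d and the angle Θ, for example as in the following sketch (values and names are illustrative):

```python
import numpy as np

def chord_error(d, theta):
    """Euclidean distance L2 between P1 and P4: L2 = d * sqrt(2 * (1 - cos(theta)))."""
    return d * np.sqrt(2.0 * (1.0 - np.cos(theta)))

def arc_error(d, theta):
    """Arc length a of the circular arc through P1 and P4 centered on C: a = theta * d."""
    return theta * d

# Example: reference point 2000 mm away, angular deviation of 0.025 rad;
# chord_error(2000.0, 0.025) and arc_error(2000.0, 0.025) are both roughly 50 mm.
```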
Further, a threshold value S1, S2 can also be defined by a user, to which the calibration error can be compared. This threshold value S1, S2 can be determined depending on a distance of the calibration object 9 or the reference point P1 to the camera 4. For example, with a distance of 2 m between the camera 4 and the calibration object 9, an error of 50 mm can be accepted. With a distance of 5 m to the camera 4, for example, a deviation of 100 mm can be accepted. With a simple linear interpolation, the threshold values S1, S2 can be defined for different distances and be compared to each other.
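A distance-dependent threshold as in the example above could, for instance, be obtained by linear interpolation between the preset anchor values; the sketch below uses the 2 m / 50 mm and 5 m / 100 mm values from the text, while the function name is an assumption:

```python
import numpy as np

def error_threshold_mm(distance_m, anchors=((2.0, 50.0), (5.0, 100.0))):
    """Distance-dependent error threshold via linear interpolation between preset anchors.

    anchors: (distance in m, accepted error in mm) pairs taken from the example in the text.
    """
    distances, thresholds = zip(*anchors)
    return float(np.interp(distance_m, distances, thresholds))

# error_threshold_mm(2.0) -> 50.0, error_threshold_mm(3.5) -> 75.0, error_threshold_mm(5.0) -> 100.0
```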
Therein, it can also be provided that the calibration error, i.e. the distance L2 or the arc length a, is normalized so that it can be used for different camera models and different calibration settings in a simple manner. Therein, the normalized distance L2n can be determined according to the following formula: L2n = L2 / d = √(2 · (1 − cos Θ)).
The normalized arc length an can be determined according to the following formula: an = a / d = Θ.
The threshold values too can be defined according to the same logic. Therein, a user can define a maximally acceptable distance L2max or a maximally acceptable arc length amax depending on the distance d. The threshold value S1 for the distance L2 results according to the following formula: S1 = L2max / d.
The threshold value S2 for the arc length a results according to the following formula: S2 = amax / d.
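The normalized error measures and the corresponding thresholds from the formulas above can be computed, for example, as in this illustrative sketch (function names are assumptions):

```python
import numpy as np

def normalized_errors(theta):
    """Normalized chord error L2n = L2 / d and normalized arc length an = theta."""
    l2n = np.sqrt(2.0 * (1.0 - np.cos(theta)))   # L2 / d
    return l2n, theta

def normalized_thresholds(d, l2_max, a_max):
    """Thresholds S1 = L2max / d and S2 = amax / d for the normalized error measures."""
    return l2_max / d, a_max / d

# A calibration is accepted if l2n <= S1 (or theta <= S2), independent of the
# absolute distance of the calibration object to the camera.
```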
The respective calibration error can be determined for a plurality of reference points P1 of the calibration object 9. Thus, the calibration of the camera 4 can overall be evaluated in a reliable manner.

Claims

Claims
1. Method for calibrating a camera (4) for a motor vehicle (1), in which a calibration object (9) is positioned in an environment (8) of the camera (4), an image of the calibration object (9) is captured by the camera (4), the camera (4) is calibrated depending on the captured image and a calibration error is determined for evaluating the calibration, wherein a point (P2) of a reference point (P1) of the calibration object (9) imaged to an image sensor (10) of the camera (4) is determined for determining the calibration error,
characterized in that
a back projected point (P4), which describes the back projection of the imaged point (P2) into the environment (8) of the camera (4), is determined and the calibration error is determined based on a deviation between the reference point (P1) of the calibration object (9) and the back projected point (P4).
2. Method according to claim 1,
characterized in that
a respective back projected point (P4) is determined for a plurality of reference points (P1) of the calibration object (9) and the calibration error is determined on the basis of the respective deviation between the reference point (P1) and the back projected point (P4).
3. Method according to claim 1 or 2,
characterized in that
a distance (d) between the reference point (P1) and a projection center (C) is determined and the back projected point (P4) is determined such that a distance (d) between the back projected point (P4) and the projection center (C) and the distance (d) between the reference point (P1) and the projection center (C) are identical.
4. Method according to claim 3,
characterized in that
the calibration error is normalized based on the distance (d) between the reference point (P1) and the projection center (C).
5. Method according to claim 3 or 4,
characterized in that
a projected point (P3) is determined, which describes a projection of the reference point (P1) of the calibration object (9) to the image sensor (10) by means of a central projection.
6. Method according to any one of claims 3 to 5,
characterized in that
the deviation is determined based on an angle (Θ) between a first line (G1) connecting the projection center (C) and the reference point (P1) of the calibration object (9) and a second line (G2) connecting the projection center (C) to the back projected point (P4).
7. Method according to any one of the preceding claims,
characterized in that
the deviation is determined based on a distance (L2) between the reference point (P1) of the calibration object (9) and the back projected point (P4).
8. Method according to any one of claims 1 to 6,
characterized in that
the deviation is determined based on an arc length (a) of an arc, which extends through the reference point (P1) of the calibration object (9) and the back projected point (P4) and the center point of which is the projection center (C).
9. Method according to any one of the preceding claims,
characterized in that
the calibration error is compared to a threshold value, wherein the threshold value is determined depending on a distance between the reference point (P1) of the calibration object (9) and the camera (4).
10. Method according to any one of the preceding claims,
characterized in that
intrinsic parameters and/or extrinsic parameters of the camera (4) are adapted in calibrating the camera (4).
11. Camera (4) for a motor vehicle (1), which is calibrated according to a method according to any one of the preceding claims.
12. Camera (4) according to claim 11,
characterized in that
the camera (4) has a fish-eye objective.
13. Motor vehicle (1) with at least one camera (4) according to claim 11 or 12.
PCT/EP2018/050637 2017-01-16 2018-01-11 Method for calibrating a camera for a motor vehicle considering a calibration error, camera as well as motor vehicle WO2018130605A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102017000307.6A DE102017000307A1 (en) 2017-01-16 2017-01-16 Method for calibrating a camera for a motor vehicle taking into account a calibration error, camera and motor vehicle
DE102017000307.6 2017-01-16

Publications (1)

Publication Number Publication Date
WO2018130605A1

Family

ID=60955066

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/050637 WO2018130605A1 (en) 2017-01-16 2018-01-11 Method for calibrating a camera for a motor vehicle considering a calibration error, camera as well as motor vehicle

Country Status (2)

Country Link
DE (1) DE102017000307A1 (en)
WO (1) WO2018130605A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109458990A (en) * 2018-11-08 2019-03-12 华南理工大学 A kind of instrument and equipment pose measurement and error compensating method based on the detection of label-free anchor point
CN112785519A (en) * 2021-01-11 2021-05-11 普联国际有限公司 Positioning error calibration method, device and equipment based on panoramic image and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020206354A1 (en) * 2020-05-20 2021-11-25 Siemens Mobility GmbH Method for calibrating one or more environment sensors arranged on a rail vehicle

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150288951A1 (en) * 2014-04-08 2015-10-08 Lucasfilm Entertainment Company, Ltd. Automated camera calibration methods and systems
US20160210750A1 (en) * 2015-01-16 2016-07-21 Magna Electronics Inc. Vehicle vision system with calibration algorithm

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10118514B4 (en) 2001-04-16 2005-08-18 Vmt Bildverarbeitungssysteme Gmbh Method for operating point stabilization in contactless 3D position detection of an object to be measured by means of digital cameras
JP4533824B2 (en) 2005-08-30 2010-09-01 株式会社日立製作所 Image input device and calibration method
DE102014117888A1 (en) 2014-12-04 2016-10-13 Connaught Electronics Ltd. Online calibration of a motor vehicle camera system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150288951A1 (en) * 2014-04-08 2015-10-08 Lucasfilm Entertainment Company, Ltd. Automated camera calibration methods and systems
US20160210750A1 (en) * 2015-01-16 2016-07-21 Magna Electronics Inc. Vehicle vision system with calibration algorithm

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HOANG VAN-DUNG ET AL: "Automatic calibration of camera and LRF based on morphological pattern and optimal angular back-projection error", INTERNATIONAL JOURNAL OF CONTROL, AUTOMATION AND SYSTEMS, KOREAN INSTITUTE OF ELECTRICAL ENGINEERS, SEOUL, KR, vol. 13, no. 6, 26 September 2015 (2015-09-26), pages 1436 - 1445, XP035961342, ISSN: 1598-6446, [retrieved on 20150926], DOI: 10.1007/S12555-014-0287-X *
HOLD S ET AL: "Efficient and robust extrinsic camera calibration procedure for Lane Departure Warning", INTELLIGENT VEHICLES SYMPOSIUM, 2009 IEEE, IEEE, PISCATAWAY, NJ, USA, 3 June 2009 (2009-06-03), pages 382 - 387, XP031489871, ISBN: 978-1-4244-3503-6 *
KANNALA J ET AL: "A generic camera calibration method for fish-eye lenses", PATTERN RECOGNITION, 2004. ICPR 2004. PROCEEDINGS OF THE 17TH INTERNAT IONAL CONFERENCE ON CAMBRIDGE, UK AUG. 23-26, 2004, PISCATAWAY, NJ, USA,IEEE, LOS ALAMITOS, CA, USA, vol. 1, 23 August 2004 (2004-08-23), pages 10 - 13, XP010724135, ISBN: 978-0-7695-2128-2, DOI: 10.1109/ICPR.2004.1333993 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109458990A (en) * 2018-11-08 2019-03-12 华南理工大学 A kind of instrument and equipment pose measurement and error compensating method based on the detection of label-free anchor point
CN109458990B (en) * 2018-11-08 2020-12-22 华南理工大学 Instrument and equipment pose measurement and error compensation method based on mark-free anchor point detection
CN112785519A (en) * 2021-01-11 2021-05-11 普联国际有限公司 Positioning error calibration method, device and equipment based on panoramic image and storage medium

Also Published As

Publication number Publication date
DE102017000307A1 (en) 2018-07-19

Similar Documents

Publication Publication Date Title
JP4690476B2 (en) Car camera calibration system
JP4751939B2 (en) Car camera calibration system
JP4803450B2 (en) On-vehicle camera calibration device and vehicle production method using the device
JP4803449B2 (en) On-vehicle camera calibration device, calibration method, and vehicle production method using this calibration method
JP5081313B2 (en) Car camera calibration system
JP4636346B2 (en) Car camera calibration apparatus, method, and program
US9361687B2 (en) Apparatus and method for detecting posture of camera mounted on vehicle
WO2010109730A1 (en) Camera calibrator
WO2010113672A1 (en) Calibration indicator used for calibration of onboard camera, calibration method of onboard camera using calibration indicator, and program for calibration device of onboard camera using calibration indicator
CN108886606B (en) Mounting angle detection device, mounting angle calibration device, and mounting angle detection method for in-vehicle camera
JP2009288152A (en) Calibration method of on-vehicle camera
WO2018130605A1 (en) Method for calibrating a camera for a motor vehicle considering a calibration error, camera as well as motor vehicle
KR20160090677A (en) Parking guide system and method for controlling the same
KR20160077684A (en) Apparatus and method for tracking object
WO2014045344A1 (en) Foe setting device and foe setting method for on-vehicle camera
JPWO2018042954A1 (en) In-vehicle camera, adjustment method of in-vehicle camera, in-vehicle camera system
JP5175230B2 (en) Automatic camera calibration apparatus and automatic calibration method
JP6669182B2 (en) Occupant monitoring device
US20160121806A1 (en) Method for adjusting output video of rear camera for vehicles
JP6450530B2 (en) In-vehicle camera mounting angle adjustment processing, mounting angle detection device
JP5173551B2 (en) Vehicle perimeter monitoring apparatus and camera mounting position / posture information setting correction method applied thereto
JP2009212734A (en) Automatic calibration monocular stereo vision device
JP2001272210A (en) Distance-recognition apparatus
JP4905812B2 (en) Camera calibration device
JP2021099722A (en) Calibration device, imaging apparatus, movable body, and calibration method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18700290

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18700290

Country of ref document: EP

Kind code of ref document: A1