WO2012139660A1 - Online vehicle camera calibration based on road marking extractions - Google Patents


Info

Publication number
WO2012139660A1
WO2012139660A1 (PCT/EP2011/056032)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
camera
road
points
online calibration
Prior art date
Application number
PCT/EP2011/056032
Other languages
French (fr)
Inventor
Myles Friel
Derek Anthony Savage
Ciaran Hughes
Peter Bone
Original Assignee
Connaught Electronics Limited
Application Solutions (Electronics and Vision) Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Connaught Electronics Limited and Application Solutions (Electronics and Vision) Limited
Priority to PCT/EP2011/056032 priority Critical patent/WO2012139660A1/en
Publication of WO2012139660A1 publication Critical patent/WO2012139660A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Abstract

The invention relates to a method for online calibration of a vehicle video system, evaluated from image frames of a camera containing longitudinal road features. A portion of a road surface is captured by the camera in an image frame. Longitudinal road features are identified within the image frame. Points are extracted along the identified longitudinal road features and transformed to a virtual road plan view via a perspective mapping taking into account prior known parameters of the camera. The transformed extracted points are analysed with respect to the vehicle by determining a possible deviation of the points from a line parallel to the vehicle, any measured deviation being used to define an offset correction of the camera.

Description

ONLINE VEHICLE CAMERA CALIBRATION BASED ON

ROAD MARKING EXTRACTIONS

BACKGROUND OF THE INVENTION

FIELD OF THE INVENTION

[0001] This invention relates generally to an online calibration of a vehicle video system, and particularly to a method for online calibration of the vehicle video system evaluated from image frames of a camera. It is also related to a computer program product for processing data relating to the online calibration of the vehicle video system, the computer program product comprising a computer usable medium having a computer usable program code embodied therewith, the computer program code being configured to perform any of the method steps. The present invention is further related to an online calibration system for a vehicle video system for processing the computer implemented online calibration method.

DESCRIPTION OF BACKGROUND

[0002] It is known to mount image capture devices, such as, for example, digital or analogue video cameras, on a motor vehicle in order to produce a video image of an aspect of the environment exterior of the vehicle. For example, in order to assist in parking and manoeuvring a vehicle in confined spaces, it is known to mount such image capturing devices on respective opposite sides of the vehicle, for example, on side rear view mirror housings which extend sidewardly from the driver and front passenger doors of the vehicle. The image capture devices are mounted in the side rear view mirror housings with the field of view of the image capture devices directed downwardly towards the ground for capturing plan view images of the ground on respective opposite sides of the vehicle adjacent the vehicle. Typically, a visual display unit is located in the vehicle, either in or on the dashboard, or in a location corresponding to that of a conventional interiorly mounted rear view mirror. Nowadays, head-up displays are also used in vehicles. Even a projection onto the windscreen is now possible. When a driver is undertaking a parking manoeuvre or a manoeuvre in a confined space, a view image of the vehicle with the respective view images of the ground on respective opposite sides of the vehicle can be displayed on the visual display unit. The view display of the vehicle and the ground on respective opposite sides of the vehicle assists the driver in parking, and in particular, in carrying out a parking manoeuvre for parking the vehicle in a parking space parallel to a kerb of a footpath or the like.

[0003] However, in order that the view images of the ground accurately reflect the positions of objects relative to the vehicle, which are captured in the images, it is essential that the view images of the ground juxtaposed with the view image of the vehicle should accurately represent a top view of the ground adjacent the respective opposite sides of the vehicle exactly as would be seen when viewed from above. In other words, the edges of the respective view images of the ground which extend along the sides of the view image of the vehicle must correspond directly with the edge of the ground along the sides of the vehicle when viewed from a position above the vehicle. Otherwise, the positions of objects in the respective view images of the ground will not be accurately positioned relative to the vehicle. For example, if the edge of one of the view images of the ground adjacent the corresponding side of the view image of the vehicle corresponds with a portion of a view of the ground which is spaced apart from the side of the vehicle, then the positions of objects in the view image of the ground will appear closer to the vehicle in the image than they actually are. Conversely, if one of the image capture devices is mounted on the side mirror housing so that an image of a portion of the ground beneath a side of the vehicle is captured, the positions of objects captured in the view image will appear farther away from the vehicle than they actually are, with disastrous results, particularly if a driver is parking the vehicle parallel to a wall or bollards. Similar requirements apply also for front or rear placed image capture devices. Often, the most obvious effect of poor calibration occurs when a merged image is presented to the user and one of the cameras is not correctly calibrated; in this case an object/feature on the ground can appear disjointed or elongated, or disappear completely from the view presented to the user.
[0004] Accordingly, it is essential that the view images of the ground, when displayed on the visual display screen juxtaposed with the view image of the vehicle, must be representative of views of the ground on respective opposite sides of the vehicle exactly as would be seen from a top view of the vehicle and adjacent ground. In order to achieve such accuracy, the image capture devices would have to be precision mounted on the vehicle. In practice this is not possible. Accordingly, in order to achieve the appropriate degree of exactness and accuracy of the view images of the ground relative to the view image of the vehicle, it is necessary to calibrate the outputs of the image capture devices. Calibration values determined during calibration of the image capture devices are then used to correct subsequently captured image frames for offset of the image capture devices from ideal positions thereof, so that view images of the ground subsequently outputted for display with the view image of the vehicle are exact representations of the ground on respective opposite sides of the vehicle. Such calibration can be accurately carried out in a factory during production of the motor vehicle. The calibration may also use the absolute position/rotation of the camera. Typically, the image capture devices are relatively accurately fitted in the side mirror housings of the motor vehicle, and by using suitable grid patterns on the ground, calibration can be effected. However, the environments in which motor vehicles must operate are generally relatively harsh environments, in that side mirror housings are vulnerable to impacts with other vehicles or stationary objects. While such impacts may not render the orientation of the side mirror housing unsuitable for producing an adequate rear view from a rear view mirror mounted therein, such impacts can, and in general do, result in the image capturing device mounted therein being knocked out of alignment, in other words being offset from its ideal position. Additionally, where a vehicle is involved in a crash, or alternatively where a side mirror housing requires replacement, recalibration of the image capture device refitted in the new side mirror housing will be required. Such recalibration, which typically would be carried out using a grid pattern on the ground, is unsatisfactory, since in general it is impossible to accurately align the vehicle with the grid pattern in order to adequately calibrate the image capture device, unless the calibration is being carried out under factory conditions. The same applies for rear or front placed image capture devices, even in the case those devices are placed in the interior of the vehicle, since harsh conditions apply there too.

[0005] In WO 2009/027090 a method and system is described for online calibration of a vehicle video system using vanishing points evaluated from frames of a camera image containing identified markings or edges on a road. In US 2009/0290032 a system and method is described for calibrating a camera on a vehicle by identifying at least two feature points in at least two camera images from a vehicle that has moved between taking the images. The method then determines a camera translation direction between two camera positions. Following this, the method determines a ground plane in camera coordinates based on the corresponding feature points from the images and the camera translation direction. The method then determines a height of the camera above the ground and a rotation of the camera in vehicle coordinates.

SUMMARY OF THE INVENTION

[0006] In view of the above, there is a need for a method and a calibration system for calibrating an output of an image capture device or camera mounted on a vehicle to compensate for offset of the camera from an ideal position, such method and calibration system being based on a simple use of available longitudinal road features without requiring a calibration of the camera to be carried out in a controlled environment.

[0007] According to a first aspect of an embodiment of the present invention, a method for online calibration of a vehicle video system is evaluated from image frames of a camera containing longitudinal road features. The method comprises the following steps: a portion of the road surface is captured by the camera in an image frame. Then longitudinal road features are identified within the image frame. Points along the identified longitudinal features are extracted and transformed to a virtual road plan view via perspective mapping taking into account prior known parameters of the camera. An analysis of the transformed extracted points is performed with respect to the vehicle to determine a possible deviation of the points from a line parallel to a line alongside the vehicle. The consequently determined deviation is then applied for an online calibration of the camera.

[0008] Advantageously, the determined deviation is applied as an error measure to be minimised when adjusting rotation parameters used for the calibration of the camera. In an embodiment according to the invention, the extracted points along identified longitudinal road features within a sequence of image frames are analysed in the transformed virtual road plan view and stored over a period of time. Such stored data are then used for a determination of a deviation of the points to be applied as an error measure to be minimised when adjusting rotation parameters for the calibration of the camera. In both alternatives, a binary search method can be applied when minimising the error measure. A nonlinear optimization method can also be applied when minimising the error measure. Alternatively, the calibration is calculated for each frame and the result is stored. The calibrations from all frames are finally averaged.
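The per-frame averaging alternative just described can be sketched as follows; `calibrate_frame` is a hypothetical stand-in for whatever single-frame solver produces a (y-rotation, z-rotation) estimate, and is not a function named in the patent:

```python
def averaged_calibration(frames, calibrate_frame):
    """Alternative of paragraph [0008]: run the calibration independently
    on every frame, store each (ry, rz) result, and average the results.

    `calibrate_frame` is an assumed stand-in for the single-frame solver."""
    results = [calibrate_frame(f) for f in frames]
    n = len(results)
    ry = sum(r[0] for r in results) / n
    rz = sum(r[1] for r in results) / n
    return ry, rz
```

In practice the averaging would run over many frames, smoothing out per-frame noise in the rotation estimates.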

[0009] In some embodiments according to the invention the calibration of the camera is performed about a y-axis transverse to the vehicle and parallel to the road plane and about a z-axis transverse to the vehicle and perpendicular to the road plane.

[0010] Advantageously, longitudinal road features more than a predefined distance away from the car are used for the calibration of the camera about the y-axis transverse to the vehicle and parallel to the road plane. Longitudinal road features less than a predefined distance away from the car can also be used for the calibration of the camera about the z-axis transverse to the vehicle and perpendicular to the road plane.

[0011] It is further possible according to the invention to use the steering angle of the vehicle when transforming the extracted points to the virtual road plan view.

[0012] It is also possible to analyse the extracted points to be rejected as outliers if fulfilling some criteria. Advantageously, the extracted points are rejected when analysing the transformed extracted points in the case a line with an angle or curvature greater than a predefined value is built.

[0013] According to a second aspect of the embodiment, a computer program product for processing data relating to online calibration of a vehicle video system is also described and claimed herein. The computer program product comprises a computer usable medium having computer usable program code embodied therewith, the computer usable program code being configured to perform the above summarized methods.
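The outlier criterion of paragraph [0012] can be sketched as follows; the line angle is obtained from a least-squares fit, and the 0.1 rad threshold is an illustrative assumption, not a value from the patent:

```python
import math

def line_angle(points):
    """Angle of the least-squares line through (x, y) points,
    relative to the vehicle x-axis."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    return math.atan2(sxy, sxx)

def is_outlier(points, max_angle=0.1):
    """Rejection criterion sketched from paragraph [0012]: a detected
    line whose angle exceeds a predefined threshold is rejected.
    The 0.1 rad default is an assumption for illustration."""
    return abs(line_angle(points)) > max_angle
```

A curvature test could be added analogously by fitting a quadratic and thresholding its leading coefficient.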

[0014] According to a third aspect of the embodiment, an online calibration system for a vehicle video system is also described and claimed herein. The online calibration system comprises a computer program product for processing data relating to an online calibration method and an image processing apparatus with a camera for taking image frames to be used by the online calibration method, such to perform the above summarized methods.

[0015] Additional features and advantages are realized through the techniques of the present invention. Other embodiments and aspects of the invention are described in detail herein and are considered a part of the claimed invention. For a better understanding of the invention with advantages and features, refer to the description and to the drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

[0016] The subject matter which is regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other objects, features, and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:

[0017] FIG. 1 illustrates a side view and a top view of a vehicle with a 3d co-ordinate system;

[0018] FIG. 2 illustrates an example of longitudinal road features tracking for a wing mirror camera according to the invention;

[0019] FIG. 3 illustrates an example of longitudinal road features tracking for a wing mirror fish eye camera according to the invention;

[0020] FIG. 4 illustrates an example of a virtual road plan view according to the invention;

[0021] FIG. 5 illustrates an example of a virtual road plan view according to the invention;

[0022] FIG. 6 illustrates an example of a virtual road plan view according to the invention;

[0023] FIG. 7 illustrates an example of a virtual road plan view according to the invention.

[0024] The detailed description explains the preferred embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.

DETAILED DESCRIPTION OF THE INVENTION

[0025] As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.

[0026] Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.

[0027] A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.

[0028] Program code embodied on a computer readable medium may be transmitted within the vehicle using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.

[0029] Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the vehicle's computer, partly on the vehicle's computer as a stand-alone software package, partly on the vehicle's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the vehicle's computer through any type of wireless network, including a wireless local area network (WLAN), possibly but not necessarily through the Internet using an Internet Service Provider.

[0030] Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

[0031] These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.

[0032] The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

[0033] The present invention is a means of calibrating a camera on a vehicle to determine the extrinsic camera rotation parameters relative to the vehicle co-ordinate system when the camera is positioned in an undetermined relevant position on the vehicle body. Calibration allows the vehicle manufacturer to provide a geometrically representative and, more importantly, a useful view to a vehicle user.

[0034] Turning now to the drawings in greater detail, it will be seen that in figure 1 is shown a side view and a top view of a vehicle 1 with a defined three dimensional coordinate system XYZ as used in the following. The X-axis is chosen along the longitudinal direction, the Y-axis along the transverse direction and the Z-axis along the vertical direction of the vehicle 1.

[0035] The camera, whether placed at the rear, the front or the side (usually the wing-mirror), captures a significant portion of road surfaces in the scene. Such portion of road surfaces is stored in image frames containing some longitudinal road features like markings or edges. The present invention is preferably adapted for a camera placed at the side of the vehicle but could also be adapted for rear or front cameras. Alternately, the present invention could be used for the side camera and combined with other online calibration. Accordingly, the present invention is best suitable for the estimation of the rotation of the camera about the y- and the z-axes (see fig. 1).

[0036] Figure 2 shows an image taken by a side camera of the vehicle 1. The road ground 20 comprises road markings, here broken (dashed) lines 21, as well as an edgeline 22 at the edge 23 of the road. The present invention could be applied for any sort of road marking like a solid lane line or even a dotted lane line. Furthermore, it is also applicable when no marking at all is present, by then using the edge 23 of the road as the longitudinal road feature. On figure 2 the horizon line 24 is also visible.

[0037] Longitudinal road features (here broken lines 21 and edgeline 22 on fig. 2) are identified within the image frame captured by the camera. Any adequate method can be used to extract those longitudinal road features from the camera image. Preferably, such longitudinal road features are extracted by analyzing several columns of raw video data within a predetermined region of interest of the frame at regular vertical intervals to detect light colored blobs. This can be achieved by looking for a rising edge followed by a falling edge, a method also applicable for the road edge 23. A blob is accepted only if its average luminance is significantly greater than the surrounding road. For road markings like markings 21 and 22, a further criterion can be applied, namely that its width is within road marking min and max constraints, usually well defined through some standards while possibly country dependent.
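The blob test described above (rising edge, falling edge, luminance and width checks) can be sketched for a single column of luminance samples as follows; the threshold and width values are illustrative assumptions, not figures from the patent:

```python
def find_marking_blobs(column, road_level, min_w=3, max_w=40, contrast=30):
    """Scan one column of luminance values for bright blobs.

    A blob is a rising edge followed by a falling edge whose average
    luminance exceeds the surrounding road level by `contrast` and whose
    width lies within road-marking min/max constraints. All numeric
    defaults are illustrative assumptions."""
    blobs = []
    start = None
    for i, v in enumerate(column):
        if start is None and v > road_level + contrast:
            start = i                          # rising edge
        elif start is not None and v <= road_level + contrast:
            width = i - start                  # falling edge
            avg = sum(column[start:i]) / width
            if min_w <= width <= max_w and avg > road_level + contrast:
                blobs.append((start, i))
            start = None
    return blobs
```

Running this over several columns at regular vertical intervals yields the candidate marking points 26, 27 used later in the calibration.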

[0038] Once longitudinal road features are identified, then points 26, 27 can be extracted along them. Those points 26, 27 are used on a virtual road plan view for the online calibration. A similar method can be applied for a side camera with a fish eye or wide-angle like optic. Fig. 3 shows a similar image of road ground as shown on fig. 2, but this time taken using a side camera with a fish eye lens. The road ground 30 comprises again some road markings as broken lines 31 and edgeline 32. The road markings 31, 32 as well as the road edge 33 and the horizon 34 show a clear distortion typical for a camera with a wide-angle optic. The characteristics of the camera being known, it is possible to subtract the distortion coming from the wide-angle optic (fish eye lens) when identifying the longitudinal road features 31, 32, 33. In the same way as in the case of figure 2, points 36, 37 can be extracted here from the road markings, respectively broken line 31 and edgeline 32.

[0039] The extracted points (26, 27 for Fig. 2 and 36, 37 for Fig. 3) are transformed to a virtual road view via perspective mapping. Several parameters like the initial or the last defined position of the camera are preferably but not necessarily taken for such mapping. The initial position or location as well as a possible rotation about the x-axis (see Fig. 1) are fixed, possibly by some previous calibration. Also an initial estimate of a rotation about the y- and z-axes can be considered, possibly from mechanical data. Any other intrinsic parameters of the camera can advantageously be taken into account.

[0040] The transformation of the extracted points to a perspective image can be performed using usual rotation matrices (see e.g. http://en.wikipedia.org/wiki/Euler_angles) and camera projection matrices. Given the fact that the road markings are known to be on the ground plane, each of the extracted points in the image can be transformed to a point on the road plane. With a fish eye camera, such transformation must take into account the fish eye projection function (see e.g. http://en.wikipedia.org/wiki/Fisheye_lens, though any projection function can be used). Once lines have been detected along the extracted points, they are then mapped to the ground to adjust the camera rotation to make the line parallel to the vehicle.
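The ground-plane mapping of paragraph [0040] amounts to rotating each back-projected pixel ray into vehicle coordinates and intersecting it with the road plane z = 0. A minimal sketch, assuming the x-rotation is already fixed by a prior calibration and the camera position in vehicle coordinates is known:

```python
import numpy as np

def rot_yz(ry, rz):
    """Rotation matrix for rotations about the vehicle y- and z-axes;
    the x-rotation is assumed fixed by a previous calibration."""
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry

def image_point_to_ground(ray_cam, cam_pos, ry, rz):
    """Intersect a viewing ray with the ground plane z = 0.

    `ray_cam` is the back-projected pixel direction (for a fish-eye lens
    it would first be undistorted via the lens projection function).
    `cam_pos` is the camera position in vehicle coordinates (z = height)."""
    ray_veh = rot_yz(ry, rz) @ np.asarray(ray_cam, float)
    t = -cam_pos[2] / ray_veh[2]       # scale so the ray reaches z = 0
    return cam_pos + t * ray_veh
```

With the camera 1 m above the ground, a ray pointing straight down lands directly below the camera, as expected.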

[0041] Figure 4 shows accordingly a projection of the extracted points to a virtual road plan view with the road 40 and the vehicle 41. Longitudinal road features like broken lines 42 in the middle of the road and edgelines 43, 44 at the right and the left of the road are represented together with the right edge 45 of the road. The representation on figure 4 corresponds to a left-hand driving situation typical for some countries like the UK or Ireland. Also shown are the points 46 extracted along the broken lines 42 as well as the points 47 extracted along the edgeline 43. It is clearly visible from figure 4 that the extracted points 46 and 47 do not follow the corresponding chosen longitudinal road features, i.e. are not parallel to the vehicle x-axis. This comes from the fact that the y- and z-rotations of the selected camera to calibrate are considered with incorrect initial parameters.

[0042] The advantage of the present invention is to adjust the y- and z-rotation parameters such that the re-projected points form lines that are parallel to the vehicle x-axis. The introduced error function, therefore, becomes the measure of how parallel the projected points are to the vehicle x-axis. A minimization algorithm based on simple binary search (usually one camera angle is solved at a time) can be used to estimate the parameters (y- and z-rotation parameters) that minimize this error measure. Initial rotation parameters are chosen as starting point for the solver.
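The one-angle-at-a-time minimisation can be sketched as follows. The error measure is the spread of the re-projected y-coordinates (zero when the points form a line parallel to the x-axis); the search below is a ternary-search variant of the bisection idea, valid for a unimodal error, and models the re-projection as a simple in-plane rotation purely for illustration — the patent's actual re-projection is the ground-plane mapping of paragraph [0040]:

```python
import numpy as np

def parallelism_error(points, rz):
    """Error measure: after undoing a candidate z-rotation, how far the
    re-projected points deviate from a line parallel to the vehicle
    x-axis (standard deviation of their y-coordinates)."""
    c, s = np.cos(rz), np.sin(rz)
    y = -s * points[:, 0] + c * points[:, 1]
    return float(np.std(y))

def binary_search_angle(points, lo=-0.2, hi=0.2, iters=40):
    """Search over one rotation angle at a time by repeatedly shrinking
    the interval towards the smaller error; bounds are illustrative."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if parallelism_error(points, m1) < parallelism_error(points, m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2
```

Given points on a line that appears rotated by some small angle, the search recovers that angle as the error minimum.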

[0043] On figure 5 is shown a similar plan view of the virtual road 50 as in figure 4, with the vehicle 51, the longitudinal road features with markings 52 (broken lines), 53, 54 (right and left edgelines) and the right edge of the road 55. Also shown are the re-projected points 56, 57 after the solver selects the correct y- and z-rotations. The points 56, 57 now follow the respective longitudinal road features in a parallel way while still not matching them. This is due to the fact that the x-axis rotation parameter cannot be estimated with the use of the present method.

[0044] In case such an x-axis rotation parameter is available using an alternative method, then it is possible to complete the online calibration taking into account the latter parameter. Figure 6 shows the final result with a calibration on the three axes with the road 60, the vehicle 61, the longitudinal road features 62, 63, 64, 65 and the projected points 66 and 67.

[0045] It appears that the primary cause of non-parallelism to the vehicle x-axis is rotation about the y-axis for lines that are parallel to the side of the vehicle and more than a given distance A m away. Thus, the problem of online calibration is preferably divided into two problems, namely lines that are more than A meters away are only used to determine the rotation about the y-axis. A is set to 2 m, but other values could be chosen as a predefined parameter (stored in some memory made available for the online calibration). Such parameter can be adjusted according to the camera being calibrated and the location of the camera on the vehicle.

[0046] In a similar way, to reduce the dependence of the z-axis calibration on the previous y-axis calibration, only lines within B m of the vehicle can be used in the calibration of the z-axis rotation. B can be chosen as a predefined parameter, also to be stored in some memory made available for the online calibration. Such parameter can be adjusted based on the camera type and position on the vehicle. B is set to 0.5 m, but other values could also be chosen according to some criteria, possibly related to the kind of camera to be calibrated.
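The split of paragraphs [0045] and [0046] — far lines for the y-axis, near lines for the z-axis — can be sketched as a simple partition by lateral distance from the camera track; A = 2 m and B = 0.5 m follow the defaults in the description, and the lateral-offset measure is an illustrative assumption:

```python
def split_points_for_calibration(points, cam_y, A=2.0, B=0.5):
    """Partition extracted ground points by lateral distance from the
    camera trajectory: points more than A metres away drive the y-axis
    (pitch) calibration, points within B metres drive the z-axis (yaw)
    calibration. A = 2 m and B = 0.5 m are the defaults given in the
    description; points between B and A are used for neither."""
    for_y_axis, for_z_axis = [], []
    for p in points:
        d = abs(p[1] - cam_y)      # lateral offset from the camera track
        if d > A:
            for_y_axis.append(p)
        elif d < B:
            for_z_axis.append(p)
    return for_y_axis, for_z_axis
```

For the right-side camera of figure 7, the near broken-line points 76 would land in the z-axis set and the far edgeline points 77 in the y-axis set.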

[0047] Figure 7 shows a similar plan view as in figure 6, now with selections of the different extracted points, on the one hand for the calibration of the z-axis rotation and on the other hand for the calibration of the y-axis rotation. On the road 70 with the vehicle 71 are shown the different longitudinal road features 72, 73, 74, 75 as well as the extracted points 76 and 77. In the considered case of the calibration of a camera at the right side of the vehicle 71 and a left-hand driving situation, the points 76 extracted from the broken lines 72 are the ones close to the vehicle. In the case those extracted points 76 are within the predefined interval B, possibly defined from the approximate trajectory 78 (broken line) of the camera (wing-mirror camera) outwards and set to be equal to 0.5 m, then those points 76 are considered exclusively for the calibration of the z-axis rotation. In the same way, the points 77 extracted from the edgeline 73 are the ones far from the vehicle. Therefore, those extracted points 77 being outside the predefined interval A, possibly defined from the approximate trajectory 78 of the camera outwards and set to be equal to 2 m, those points 77 are considered exclusively for the calibration of the y-axis rotation.

[0048] The online calibration described above can be applied in a similar way for a camera placed at the left side of the vehicle. Also, different longitudinal road features can be selected from which points can be extracted. For example, in case there are no markings or the markings are not clearly visible, the edge of the road can be used as a longitudinal feature for the online calibration.

[0049] In an embodiment according to the invention, lines that are extracted (using the extracted points) can be stored over a period of time. Thus, a large set of lines can be stored and a minimization algorithm can be used to determine the y- and z-axis rotations. For that, extracted lines are stored for some possibly predefined length of time or number of image frames taken by the camera to be calibrated. Once a number of edges have been extracted at different points in time, the rotation parameters are determined using a search or error minimization algorithm. Currently, a binary search method (http://en.wikipedia.org/wiki/Binary_search_algorithm) is used, possibly combined with the use of the A and B intervals to differentiate between the calibration along the y- and the z-axes. Any known error minimization algorithm could also be applied here. Also, a more complex algorithm like Nelder-Mead shall preferably be used when the y- and z-axis rotation calibrations are solved together.

[0050] Advantageously, the steering angle of the vehicle can be used to ensure that the vehicle is travelling along a straight direction when the online calibration shall be applied. Additionally, a rejection criterion can be defined for outliers, such that a line detected with an angle greater than a given threshold could be rejected. The threshold can be predefined, possibly according to some experience collected in advance. Alternately, the constraint that the tracks are parallel to the x-axis of the vehicle could be eliminated by using the steering information or by having a range of predefined expected steering curvature. If initial estimates of the camera extrinsic parameters are known (e.g. from vehicle mechanical data), they can be used as starting points for the calibration.
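To illustrate how a binary search over a rotation parameter could work, the sketch below assumes a simplified model (an assumption, not the patent's implementation): a residual rotation by angle `err` makes the truly horizontal transformed lines appear with a common slope tan(err), and the search bisects on the sign of the mean residual slope, which is monotone in the trial correction near the solution.

```python
# Sketch of a binary search for the rotation correction that zeroes the
# mean slope of the stored lines. Each line is represented by its
# apparent slope angle (radians) in the plan view; search bounds and
# iteration count are illustrative.
import math

def mean_slope(lines, correction):
    """Mean slope of the stored lines after applying a trial rotation
    correction to each line's apparent slope angle."""
    return sum(math.tan(a - correction) for a in lines) / len(lines)

def binary_search_rotation(lines, lo=-0.2, hi=0.2, iters=40):
    """Bisect on the sign of the mean residual slope. Assumes the true
    correction lies in [lo, hi], where the slope decreases monotonically
    as the trial correction increases."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_slope(lines, mid) > 0.0:
            lo = mid   # lines still tilt upward: correct further
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

With several stored lines all tilted by 0.05 rad, the search converges to a correction of 0.05 rad; a joint y- and z-axis solution would instead use a multidimensional method such as Nelder-Mead, as the paragraph above notes.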

Advantageously, the speed and steering information possibly available on the vehicular network (e.g. via CAN, LIN, wireless or other) can be used when transforming the extracted points to remove (relax) the necessity for the vehicle to be moving in a straight line and at constant speed.
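One way such network data could be used when transforming the points is to undo the vehicle's own motion between frames before the extracted points are compared. The sketch below assumes a simple planar kinematic model (constant speed and yaw rate over the frame interval); the model, function name and parameters are illustrative assumptions, not taken from the patent.

```python
# Sketch: map plan-view points from the previous frame into the current
# vehicle frame, using speed and yaw rate read from the vehicle network,
# so that the straight-line / constant-speed assumption can be relaxed.
import math

def compensate_motion(points, speed, yaw_rate, dt):
    """Undo planar vehicle motion: `speed` in m/s along the x-axis,
    `yaw_rate` in rad/s, over `dt` seconds (small-angle translation
    model). Returns the points expressed in the current vehicle frame."""
    dtheta = yaw_rate * dt
    dx = speed * dt                      # forward travel
    cos_t, sin_t = math.cos(dtheta), math.sin(dtheta)
    out = []
    for x, y in points:
        xs, ys = x - dx, y               # undo the translation
        # undo the rotation about the vehicle z-axis
        out.append((cos_t * xs + sin_t * ys, -sin_t * xs + cos_t * ys))
    return out
```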

[0051] The criteria for rejecting extracted points as outliers using the velocity of the vehicle could be based on the fact that when a vehicle is travelling faster, it is more likely to be travelling parallel to the longitudinal road features like the markings or the edge of the road. In contrast, when a vehicle is travelling slower (e.g. at junctions and roundabouts, etc.) it is likely that the road features captured by the camera are not actually parallel to the direction of the vehicle motion (i.e. parallel to the vehicle x-axis). Road-marking color information could also be considered as a rejection criterion, possibly but not necessarily in combination with the vehicle velocity. For example, if a green road blob is detected in areas where only white or yellow/orange markings are expected, it is highly likely that this is an erroneous detection and should be rejected as an outlier.

[0052] Online calibration for a camera of a vehicle video system according to the present invention is particularly suitable above a minimum speed of the vehicle, e.g. above 5.0 km/h. Therefore, it is of advantage to combine the online calibration according to the present invention with other online calibration methods.

[0053] The capabilities of the present invention can be implemented in software, firmware, hardware or some combination thereof.
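The rejection criteria of paragraphs [0050]–[0052] can be combined into a single gate, sketched below. The threshold values and set of expected marking colors are illustrative assumptions (only the 5.0 km/h minimum speed and the white/yellow/orange examples appear in the text above).

```python
# Sketch: decide whether an extracted line may be used for online
# calibration, combining the minimum-speed, line-angle and marking-color
# rejection criteria. Thresholds are illustrative.

MIN_SPEED_KMH = 5.0            # below this, skip online calibration
MAX_ANGLE_DEG = 10.0           # reject lines tilted more than this
EXPECTED_COLORS = {"white", "yellow", "orange"}

def accept_line(speed_kmh, line_angle_deg, color):
    """Return True if an extracted line passes all rejection criteria."""
    if speed_kmh < MIN_SPEED_KMH:
        return False           # vehicle too slow (junctions, roundabouts)
    if abs(line_angle_deg) > MAX_ANGLE_DEG:
        return False           # likely not parallel to the direction of travel
    if color not in EXPECTED_COLORS:
        return False           # e.g. a green blob: likely erroneous detection
    return True
```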

[0054] As one example, one or more aspects of the present invention can be included in an article of manufacture (e.g., one or more computer program products) having, for instance, computer usable media. The media has embodied therein, for instance, computer readable program code means for providing and facilitating the capabilities of the present invention. The article of manufacture can be included as a part of a computer system or sold separately. Additionally, at least one program storage device readable by a machine, tangibly embodying at least one program of instructions executable by the machine to perform the capabilities of the present invention, can be provided.

[0055] While the preferred embodiment of the invention has been described, it will be understood that those skilled in the art, both now and in the future, may make various improvements and enhancements which fall within the scope of the claims which follow. These claims should be construed to maintain the proper protection for the invention first described.

Claims

1. A method for online calibration of a vehicle video system evaluated from image frames of a camera containing longitudinal road features, the method comprising the following steps of:
• Capturing by the camera a portion of the road surface (20, 30, 40, 50, 60, 70) in an image frame;
• Identifying longitudinal road features (21-23, 31-33, 42-45, 52-55, 62-65, 72-75) within the image frame;
The method being characterised by the further steps of
• Extracting points (26, 27, 36, 37, 46, 47, 56, 57, 66, 67, 76, 77) along the identified longitudinal road features;
• Transforming the extracted points to a virtual road plan view via perspective mapping taking into account prior known parameters of the camera;
• Analysing the transformed extracted points with respect to the vehicle by determining a deviation of the points from a line parallel to the vehicle (1, 41, 51, 61, 71);
• Applying the determined deviation for a calibration of the camera.
2. The online calibration method according to claim 1, whereby the determined deviation is applied as an error measure to be minimised when adjusting rotation parameters used for the calibration of the camera.
3. The online calibration method according to claim 1, whereby points are extracted along identified longitudinal road features within a sequence of image frames, to be analysed in the transformed virtual road plan view and stored over a period of time for a determination of a deviation of the points, applied as an error measure to be minimised when adjusting rotation parameters used for the calibration of the camera.
4. The online calibration method according to claim 2 or 3 whereby a binary search method is applied when minimising the error measure.
5. The online calibration method according to claim 2 or 3 whereby a nonlinear optimization method is applied when minimising the error measure.
6. The online calibration method according to one of the preceding claims, whereby the calibration of the camera is performed about a y-axis transverse to the vehicle and parallel to the road plane and about a z-axis transverse to the vehicle and perpendicular to the road plane.
7. The online calibration method according to claim 1, whereby longitudinal road features more than a predefined distance away from the car are used for the calibration of the camera about a y-axis transverse to the vehicle and parallel to the road plane.

8. The online calibration method according to claim 1, whereby longitudinal road features less than a predefined distance away from the car are used for the calibration of the camera about a z-axis transverse to the vehicle and perpendicular to the road plane.
9. The online calibration method according to claim 1, whereby a steering angle of the vehicle is used when transforming the extracted points to the virtual road plan view.

10. The online calibration method according to claim 1, whereby the extracted points are analysed to be rejected as outliers if fulfilling some criteria.
11. The online calibration method according to claim 10, whereby the extracted points are rejected, when analysing the transformed extracted points, if building a line with an angle curvature greater than a predefined value.
12. A computer program product for processing data relating to online calibration of a vehicle video system, the computer program product comprising a computer usable medium having computer usable program code embodied therewith, the computer usable program code being configured to perform the steps of any of the preceding claims 1 to 11.
13. An online calibration system for a vehicle video system, the online calibration system comprising a computer program product for processing data relating to an online calibration method and an image processing apparatus with a camera for taking image frames to be used by the online calibration method, such to perform the steps of any of the preceding claims 1 to 11.
PCT/EP2011/056032 2011-04-15 2011-04-15 Online vehicle camera calibration based on road marking extractions WO2012139660A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2011/056032 WO2012139660A1 (en) 2011-04-15 2011-04-15 Online vehicle camera calibration based on road marking extractions


Publications (1)

Publication Number Publication Date
WO2012139660A1 true WO2012139660A1 (en) 2012-10-18

Family

ID=44625991

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2011/056032 WO2012139660A1 (en) 2011-04-15 2011-04-15 Online vehicle camera calibration based on road marking extractions

Country Status (1)

Country Link
WO (1) WO2012139660A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050113995A1 (en) * 2002-04-10 2005-05-26 Oyaide Andrew O. Cameras
US20050163343A1 (en) * 2002-12-18 2005-07-28 Aisin Seiki Kabushiki Kaisha Movable body circumstance monitoring apparatus
WO2009027090A2 (en) 2007-08-31 2009-03-05 Valeo Schalter Und Sensoren Gmbh Method and system for online calibration of a video system
US20090290032A1 (en) 2008-05-22 2009-11-26 Gm Global Technology Operations, Inc. Self calibration of extrinsic camera parameters for a vehicle camera


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
TAN S ET AL: "Inverse perspective mapping and optic flow: A calibration method and a quantitative analysis", IMAGE AND VISION COMPUTING, ELSEVIER, GUILDFORD, GB, vol. 24, no. 2, 1 February 2006 (2006-02-01), pages 153 - 165, XP025135439, ISSN: 0262-8856, [retrieved on 20060201], DOI: 10.1016/J.IMAVIS.2005.09.023 *
YAN JIANG ET AL: "Self-calibrated multiple-lane detection system", POSITION LOCATION AND NAVIGATION SYMPOSIUM (PLANS), 2010 IEEE/ION, IEEE, PISCATAWAY, NJ, USA, 4 May 2010 (2010-05-04), pages 1052 - 1056, XP031707108, ISBN: 978-1-4244-5036-7 *

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9014904B2 (en) 2004-12-23 2015-04-21 Magna Electronics Inc. Driver assistance system for vehicle
US9193303B2 (en) 2004-12-23 2015-11-24 Magna Electronics Inc. Driver assistance system for vehicle
US10509972B2 (en) 2004-12-23 2019-12-17 Magna Electronics Inc. Vehicular vision system
US9940528B2 (en) 2004-12-23 2018-04-10 Magna Electronics Inc. Driver assistance system for vehicle
US10640041B2 (en) 2011-04-25 2020-05-05 Magna Electronics Inc. Method for dynamically calibrating vehicular cameras
US9357208B2 (en) 2011-04-25 2016-05-31 Magna Electronics Inc. Method and system for dynamically calibrating vehicular cameras
US10202077B2 (en) 2011-04-25 2019-02-12 Magna Electronics Inc. Method for dynamically calibrating vehicular cameras
US9834153B2 (en) 2011-04-25 2017-12-05 Magna Electronics Inc. Method and system for dynamically calibrating vehicular cameras
US10654423B2 (en) 2011-04-25 2020-05-19 Magna Electronics Inc. Method and system for dynamically ascertaining alignment of vehicular cameras
US9491450B2 (en) 2011-08-01 2016-11-08 Magna Electronic Inc. Vehicle camera alignment system
US9491451B2 (en) 2011-11-15 2016-11-08 Magna Electronics Inc. Calibration system and method for vehicular surround vision system
US10264249B2 (en) 2011-11-15 2019-04-16 Magna Electronics Inc. Calibration system and method for vehicular surround vision system
US10640040B2 (en) 2011-11-28 2020-05-05 Magna Electronics Inc. Vision system for vehicle
US10071687B2 (en) 2011-11-28 2018-09-11 Magna Electronics Inc. Vision system for vehicle
US10099614B2 (en) 2011-11-28 2018-10-16 Magna Electronics Inc. Vision system for vehicle
US9762880B2 (en) 2011-12-09 2017-09-12 Magna Electronics Inc. Vehicle vision system with customized display
US10542244B2 (en) 2011-12-09 2020-01-21 Magna Electronics Inc. Vehicle vision system with customized display
US10129518B2 (en) 2011-12-09 2018-11-13 Magna Electronics Inc. Vehicle vision system with customized display
US10284818B2 (en) 2012-10-05 2019-05-07 Magna Electronics Inc. Multi-camera image stitching calibration system
US9723272B2 (en) 2012-10-05 2017-08-01 Magna Electronics Inc. Multi-camera image stitching calibration system
US10486596B2 (en) 2013-02-27 2019-11-26 Magna Electronics Inc. Multi-camera dynamic top view vision system
US10179543B2 (en) 2013-02-27 2019-01-15 Magna Electronics Inc. Multi-camera dynamic top view vision system
US9688200B2 (en) 2013-03-04 2017-06-27 Magna Electronics Inc. Calibration system and method for multi-camera vision system
US10567748B2 (en) 2013-05-21 2020-02-18 Magna Electronics Inc. Targetless vehicular camera calibration method
US10266115B2 (en) 2013-05-21 2019-04-23 Magna Electronics Inc. Vehicle vision system using kinematic model of vehicle motion
US9205776B2 (en) 2013-05-21 2015-12-08 Magna Electronics Inc. Vehicle vision system using kinematic model of vehicle motion
US9563951B2 (en) 2013-05-21 2017-02-07 Magna Electronics Inc. Vehicle vision system with targetless camera calibration
US9979957B2 (en) 2013-05-21 2018-05-22 Magna Electronics Inc. Vehicle vision system with targetless camera calibration
US9701246B2 (en) 2013-05-21 2017-07-11 Magna Electronics Inc. Vehicle vision system using kinematic model of vehicle motion
US10594943B2 (en) 2014-09-30 2020-03-17 Clarion Co., Ltd. Camera calibration device and camera calibration system
US10235775B2 (en) 2015-01-16 2019-03-19 Magna Electronics Inc. Vehicle vision system with calibration algorithm
US9916660B2 (en) 2015-01-16 2018-03-13 Magna Electronics Inc. Vehicle vision system with calibration algorithm
US10620000B2 (en) 2015-10-20 2020-04-14 Clarion Co., Ltd. Calibration apparatus, calibration method, and calibration program
WO2018202464A1 (en) 2017-05-03 2018-11-08 Connaught Electronics Ltd. Calibration of a vehicle camera system in vehicle longitudinal direction or vehicle trans-verse direction
WO2019012004A1 (en) * 2017-07-12 2019-01-17 Connaught Electronics Ltd. Method for determining a spatial uncertainty in images of an environmental area of a motor vehicle, driver assistance system as well as motor vehicle


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11716214

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11716214

Country of ref document: EP

Kind code of ref document: A1